Finding Interesting Fediverse Content



I’ve been running a single-user Mastodon instance for a few months. Being a single-user instance, it can be fairly quiet at times, and one of the things I’ve been exploring is finding and adding targeted content to the Federated timeline…

Mastodon supports searching by hashtag, and we can use the local API to pull matching posts into the local Federated timeline via the ‘v2/search?resolve=true&q=’ endpoint. I started with this approach, but the problem is that it seems to be optimised for the UI: it’s a synchronous call that essentially tells the server to fetch the post right now. This works for a very low volume of searches but can quickly overwhelm the server…
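For reference, here’s a minimal sketch of that synchronous approach, using Node 18+’s built-in fetch (the host, token, and function names are illustrative, not from the script below):

```javascript
// Sketch of the synchronous v2/search approach. resolve=true tells the
// server to fetch the remote post immediately, which is what makes this
// expensive at volume.
function buildSearchUrl(host, statusUrl) {
  return `https://${host}/api/v2/search?resolve=true&q=${encodeURIComponent(statusUrl)}`;
}

async function resolveStatus(host, token, statusUrl) {
  const resp = await fetch(buildSearchUrl(host, statusUrl), {
    headers: { Authorization: `Bearer ${token}` }
  });
  const body = await resp.json();
  return body.statuses; // the resolved statuses, if any
}
```

Each call like this blocks until the server has fetched the remote status, which is why a queue-based approach works better for bulk imports.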

Luckily there is a solution: FakeRelay. FakeRelay is a tool that lets Mastodon admins load statuses into their instances. Importantly, FakeRelay is NOT a real ActivityPub relay: it simply asks your instance to fetch specific statuses, so the work is queued and processed asynchronously on the Mastodon side rather than blocking on a synchronous search.

Self-hosting a FakeRelay search bot

To get started with FakeRelay, you can ask the operator of a hosted instance for an API key, or you can host it yourself. I’m self-hosting behind Traefik.

FakeRelay docker-compose.yml


version: '2'
services:
  fakerelay:
    image: 'ghcr.io/g3rv4/fakerelay:latest'
    command: web
    hostname: fakerelay
    environment:
      - ASPNETCORE_ENVIRONMENT=Production
      - ASPNETCORE_URLS=http://+:5000
      - CONFIG_PATH=/data/config.json
    restart: always
    volumes:
      - './data:/data'
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.fakerelay.rule=Host(`relay.beyondwatts.social`)"
      - "traefik.http.routers.fakerelay.entrypoints=websecure"
      - "traefik.http.services.fakerelay.loadbalancer.server.port=5000"
      - "traefik.docker.network=proxy"
    networks:
      - proxy
  cli:
    image: 'ghcr.io/g3rv4/fakerelay:latest'
    volumes:
      - './data:/data'
    networks:
      - proxy

networks:
  proxy:
    external: true

Finding content

FakeRelay doesn’t do any more than help get the content onto your Mastodon server. To tell it which content we are interested in, we need a script. Here’s a Node.js script that runs every hour and pulls posts with certain hashtags from a defined set of hosts onto my own Mastodon instance at https://beyondwatts.social


const axios = require('axios');
const Mastodon = require('mastodon');
const log = require('npmlog');

require('dotenv').config();

log.level = process.env.LOG_LEVEL || 'verbose';

const config = {
	access_token: process.env.ACCESS_TOKEN,
	api_url: `https://${process.env.API_HOST}/api/`,
	relay_host: process.env.RELAY_HOST,
	fake_relay_access_token: process.env.FAKE_RELAY_ACCESS_TOKEN,
	hashtag: process.env.HASHTAG,
	search_hosts: process.env.SEARCH_HOSTS
};

// Note: this prints your access tokens, so only leave it in for local debugging
console.log(config);

log.info('Booting up...');
const client = new Mastodon(config);

function setupRemoteHosts()
{
	var rH = [];
	let hosts = config.search_hosts.split(',');
	for (let i = 0; i < hosts.length; i++)
	{
		hosts[i] = hosts[i].trim();
		// The mastodon client requires an access_token, but the public
		// tag timeline endpoint doesn't check it, so a dummy value works
		var newClientConfig = {
			access_token: '123',
			api_url: `${hosts[i]}/api/`
		};
		rH.push(new Mastodon(newClientConfig));
	}
	return rH;
}

async function localSearchForRemoteStatus(url)
{
	console.log('localSearchForRemoteStatus()');
	var data = await Promise.all([
		client.get(`v2/search?resolve=true&q=${url}`, {}).then(resp => resp.data),
	]);
	var statuses = data[0].statuses;
	if (statuses.length === 0)
	{
		return;
	}
	// Boost the first (and only expected) result so it lands in the timeline
	var id = statuses[0].id;
	console.log(`Boosting: ${id}`);
	client.post(`v1/statuses/${id}/reblog`);
}

async function postStatusToRelay(uri)
{
	console.log(`posting to relay ${uri}`);
	const headers = {
		'Content-Type': 'application/x-www-form-urlencoded; charset=utf-8',
		'Authorization': `Bearer ${config.fake_relay_access_token}`
	}
	axios.post(`https://${config.relay_host}/index`,{
		statusUrl: uri
		}, {
			headers
		})
		.then(response => {
  	  console.log(`posted to relay: ${uri}`);
		})
		.catch((error) => {
			console.error(error)
		})
}

async function doRemoteSearch(tagName, remoteHost)
{
	var maxResultsToDisplay = 20;
	console.log(remoteHost);
	console.log(`doRemoteSearch()`);
	console.log(`Looking for ${tagName} on ${remoteHost.config.api_url}`);
	var data = await Promise.all([
		remoteHost.get(`v1/timelines/tag/${tagName}`, {}).then(resp => resp.data),
	]);
	var status = data[0];
	var numResults = status.length;
	if (numResults < maxResultsToDisplay)
	{
		maxResultsToDisplay = numResults;
	}
	// Work backwards through the results so the oldest status is posted first
	for (let i = 0; i < maxResultsToDisplay; i++)
	{
		let index = maxResultsToDisplay - i - 1;
		console.log(`${index} - ${status[index].url}`);
		// await localSearchForRemoteStatus(status[index].url);
		await postStatusToRelay(status[index].uri);
	}
}

async function kickOffRemoteSearch()
{
	let hashtags = config.hashtag.split(',');
	for (let i = 0; i < hashtags.length; i++)
	{
		hashtags[i] = hashtags[i].trim();
		for (let j = 0; j < remoteHosts.length; j++)
		{
			await doRemoteSearch(hashtags[i], remoteHosts[j]);
		}
	}
}

var remoteHosts = setupRemoteHosts();
kickOffRemoteSearch();
setInterval(kickOffRemoteSearch, 1 * 1000 * 60 * 60); // Run every 60mins

To use the script, in addition to Node.js and the index.js file above, you’ll need to populate the following .env file:


RELAY_HOST=
FAKE_RELAY_ACCESS_TOKEN=
HASHTAG=hashtag1, hashtag2, etc

SEARCH_HOSTS=https://mapstodon.space, https://mastodon.art, https://mastodon.online, https://mastodon.social, etc

Comparing the performance of FakeRelay and the Mastodon API endpoint

If you are interested in comparing the performance of FakeRelay and the built-in Mastodon API endpoint, you can add the following lines to the .env file:

ACCESS_TOKEN=
API_HOST=

and change

		// await localSearchForRemoteStatus(status[index].url);
		await postStatusToRelay(status[index].uri);

to

		await localSearchForRemoteStatus(status[index].url);
		// await postStatusToRelay(status[index].uri);
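To put rough numbers on the comparison, you can wrap either call in a simple timing helper; here’s a sketch (the `timed` helper is mine, not part of the script above):

```javascript
// Rough timing helper: runs an async function and reports how long it
// took in milliseconds. Helper name is illustrative, not part of the
// original script.
async function timed(label, fn) {
  const start = process.hrtime.bigint();
  await fn();
  const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${label}: ${elapsedMs.toFixed(1)}ms`);
  return elapsedMs;
}

// Usage inside doRemoteSearch(), for example:
// await timed('relay', () => postStatusToRelay(status[index].uri));
// await timed('search', () => localSearchForRemoteStatus(status[index].url));
```

The synchronous search call has to wait for the remote fetch, so you should see noticeably longer times per status than with the fire-and-forget relay post.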

Running the Node.js search bot in Docker

We can create a simple Node.js container with the following Dockerfile and docker-compose.yml:

# Dockerfile
# https://mherman.org/blog/dockerizing-a-react-app/
# build environment

FROM node:16-alpine as build

WORKDIR /app

ENV PATH /app/node_modules/.bin:$PATH

COPY package.json ./
COPY . ./
RUN yarn install

CMD ["yarn", "start"]
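The Dockerfile assumes a package.json with a start script that launches index.js; something along these lines should work (package versions are indicative, not pinned by the original script):

```json
{
  "name": "search-bot",
  "private": true,
  "scripts": {
    "start": "node index.js"
  },
  "dependencies": {
    "axios": "^1.4.0",
    "dotenv": "^16.0.0",
    "mastodon": "^1.2.2",
    "npmlog": "^7.0.0"
  }
}
```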
# docker-compose.yml

version: '3.7'

services:

  prod:
    container_name: search-bot
    build:
      context: .
      dockerfile: Dockerfile
    networks:
      - proxy
      - internal
    labels:
      - "traefik.enable=false"
    restart: unless-stopped

networks:
  proxy:
    external: true
  internal: