This is a short guide on writing a bot for minds.com using minds-api, a Python API interaction package I wrote recently.
For a bot you'll need a couple of things:
- a minds.com account.
- a Linux server/computer with Python 3.6 or later.
- basic knowledge of Python or programming in general.
- an idea for a bot.
To start off we need an idea. An easy example is a repost bot - a bot that takes something from the internet and reposts it on minds.com.
So let's make a bot that takes entries from an RSS feed and posts them to your newsfeed!
Let's define the logic flow of our bot:
- connect to the RSS feed.
- parse the RSS feed.
- check if there is anything new on the feed.
- format our minds.com message from the RSS feed data.
- post the message to our newsfeed.
- save this action to history so we don't repeat it later.
For our bot let's use NASA's education RSS feed, which can be found here:
https://www.nasa.gov/rss/dyn/educationnews.rss
And have all of these articles posted to our newsfeed!
Preparation
Let's start off with step number 0: setting up the environment.
As mentioned before, we need two things: Linux and Python 3.6. Once we have those, we want to install the API package and another package that will help us parse RSS feeds, called feedparser:
$ python --version
Python 3.6.4
$ pip install minds feedparser
That's about it!
It's also a good idea to install packages inside a virtual environment, but that's a bit beyond the scope of this guide.
The code!
Steps 1 and 2 - downloading the RSS data and parsing it:
# nasa_bot.py
import feedparser

url = "https://www.nasa.gov/rss/dyn/educationnews.rss"
feed = feedparser.parse(url)  # feedparser downloads and parses the feed for us
for entry in feed.entries:
    print(entry.title)
    print(entry.description)
    print(entry.link)
    print('------')
If you run the above script, it will go through the RSS feed entries and print out each entry's title, description and link, followed by a separator line.
Let's add steps #3 and #6: checking ids against a local history file and saving new ones to it:
# nasa_bot.py
import os
from pathlib import Path

import feedparser

# step 3 goes here - load the ids we have already worked with
HISTORY_FILENAME = Path(os.path.expanduser('~/.nasa_bot_history.txt'))
HISTORY_FILENAME.touch(exist_ok=True)  # make sure history file exists
with open(HISTORY_FILENAME, 'r') as f:
    history_ids = f.read().splitlines()
history_file = open(HISTORY_FILENAME, 'a')

url = "https://www.nasa.gov/rss/dyn/educationnews.rss"
feed = feedparser.parse(url)
for entry in feed.entries:
    if entry.id in history_ids:  # skip visited
        continue
    history_file.write(entry.id + '\n')  # step 6 goes here - save ids to file
    print(entry.title)
    print(entry.description)
    print(entry.link)
    print(entry.id)
    print('------')
history_file.close()
Great, now we have a permanent local "database" storing the ids we have already worked with, so we don't post the same stuff twice!
If you run the script now, the first time you should see a bunch of output; the second time you'll probably see nothing at all, as our history file is now filled with the ids we have crawled.
Finally, let's finish everything off by posting our information to our Minds newsfeed!
# nasa_bot.py
import os
from pathlib import Path

import feedparser
from minds import Minds, Profile

USERNAME = 'your username'
PASSWORD = 'your password'
api = Minds(Profile(USERNAME, PASSWORD))

HISTORY_FILENAME = Path(os.path.expanduser('~/.nasa_bot_history.txt'))
HISTORY_FILENAME.touch(exist_ok=True)  # make sure history file exists
with open(HISTORY_FILENAME, 'r') as f:
    history_ids = f.read().splitlines()
history_file = open(HISTORY_FILENAME, 'a')

url = "https://www.nasa.gov/rss/dyn/educationnews.rss"
feed = feedparser.parse(url)
for entry in feed.entries:
    if entry.id in history_ids:  # skip visited
        continue
    # format our minds message: title, description, link and some hashtags
    msg = f'{entry.title}\n\n{entry.description}\n\n{entry.link}\n#nasa #news #education'
    # post the message
    response = api.post_newsfeed(msg)
    if response['status'] == 'success':
        history_file.write(entry.id + '\n')  # only save the id if posting succeeded
        print(f'posted: "{msg}"')
    else:
        print(f'failed posting: "{response}"')
history_file.close()
There we go! This is the full bot: once called, it will look for new NASA articles and, if any are found, post them to your newsfeed.
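Since the script exits after one pass over the feed, you'll want to run it on a schedule to actually catch new articles as they appear. One simple option on Linux is a cron entry (the paths and the 30-minute interval below are just assumptions, adjust them to your setup):

```
# run the bot every 30 minutes and append its output to a log
*/30 * * * * /usr/bin/python3 /home/you/nasa_bot.py >> /home/you/nasa_bot.log 2>&1
```

Edit your crontab with crontab -e and paste the line in; the history file ensures repeated runs never double-post the same entry.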
Conclusion
It's pretty easy to write bots for minds.com, and bots can create and automate a lot of great content.
Hopefully you learned something in this short tutorial. If you'd like to learn more about Minds, check out these links:
minds-api - an API package for Python, still in its early stages.
minds-cli - a package for interacting with Minds (posting, reading) from your terminal! (relies on the above)
dota_live - my bot that uses the same logic to post Dota 2 live matches whenever they start, with direct links to the streams.
If you wish to support me and are interested in running your bot on a server, check out my referral link for Linode hosting.
Cheers!
This blogpost was written in markdown and posted via the minds-cli application.
Disclaimer
Needless to say, one shouldn't use bots to manipulate votes - it will most likely result in an account suspension, and it harms the platform in general without much benefit.
To add to that, the Minds team asked me to point out that bots shouldn't participate in the point earning system, as that would devalue real users' points and cause unnecessary stress on the system.
The minds-api package is certainly not fit for that anyway, as it ignores JavaScript and cookies entirely, meaning it sticks out like a sore thumb. And as the creator of the package, I do not endorse such behaviour.
Cheers!