If you haven’t seen FactorDaily’s Curated, I recommend you check it out before reading this post. Curated, which is still in early beta, is our attempt to use chatbots to push the boundaries of news and content online. We’re doing this because bots are becoming increasingly common in digital newsrooms in 2016, both as internal tools and as a way to engage intelligently with an audience.
Here at FactorDaily, we are knowledge nerds. At any given point in time, we are up to our eyeballs in articles and videos, and the more we read and watch, the more we want to share.
At FactorDaily, our tool of choice to do this is Slack, the instant-messaging software that’s taking the world by storm. Slack is awesome for sharing, but since everyone uses it constantly, things can get messy. Going back and finding something, or keeping track of who is sharing what, can be tough.
Enter Regina, our homegrown Slackbot. Regina sits in a dedicated channel within the FactorDaily Slack and listens to all conversations. Each time she sees a link to a video or an article, Regina saves it in a database and exposes it as an API for search.
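Detecting a shared link is the first step. As a rough sketch (not Regina's actual code), Slack wraps URLs in a message in angle brackets, sometimes with a `|label` suffix, so pulling out the bare URL looks something like this:

```javascript
// Slack formats shared links as "<https://example.com|example.com>".
// Before saving anything, the bot needs the bare URL out of that wrapper.
function extractUrls(messageText) {
  const matches = messageText.match(/<(https?:\/\/[^>|]+)(\|[^>]*)?>/g) || [];
  return matches.map(function (m) {
    // Strip the angle brackets, then drop the optional "|label" part.
    return m.replace(/^<|>$/g, '').split('|')[0];
  });
}
```

Anything the function returns is a candidate for Regina to save and index.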
If you aren’t into coding, what follows might be, well, slightly hardcore.
Building the bot
I used this excellent tutorial on Scotch.io to get started on my bot. I code-named it Regina after the evil queen in the amazing ABC series, Once Upon A Time (it’s a long story), and extended it with a few small features including one that makes her crack Star Wars jokes.
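The tutorial wires the bot up with Botkit, whose message handlers receive the text of each message and decide on a reply. The shape of such a handler can be sketched as a plain function; the jokes and the trigger phrase below are made up for illustration, not Regina's real ones:

```javascript
// A toy handler in the shape a Botkit controller.hears() callback takes:
// given the message text, decide how (or whether) the bot replies.
// Jokes and trigger word are illustrative placeholders.
const STAR_WARS_JOKES = [
  'Why did the angry Jedi cross the road? To get to the Dark Side.',
  'How does Darth Vader like his toast? On the dark side.'
];

function reginaReply(text, pick) {
  pick = pick || Math.random; // injectable so tests can be deterministic
  if (/star wars/i.test(text)) {
    const i = Math.floor(pick() * STAR_WARS_JOKES.length);
    return STAR_WARS_JOKES[i];
  }
  return null; // stay quiet unless spoken to
}
```

Keeping the reply logic as a pure function like this makes it easy to test without a live Slack connection.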
Time spent: 90 minutes
Look Ma, no backend
So what happens when Regina detects a link? She needs to not only store it but also record who shared it and in which channel, and detect what kind of article it is.
This was supposed to be a quick hack and an internal tool at best. I didn’t want to get into the business of writing a backend that would connect to a database and save the links. However, I wanted it to be a production-quality hack, just in case we extended it in the future.
I created an app in Stamplay. Then I created a model called knowledge-object. Each knowledge-object had title, content, URL, thumbnail, tags, channel, and sender. I like to view all my data in this manner, as it gives me flexibility to add various parameters and also the certainty of what those parameters would be.
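As a rough sketch of that model, a small builder function can guarantee every saved link has the same set of fields (the field names come from the model above; the defaults are my choice):

```javascript
// Build a knowledge-object in the shape of the Stamplay model, so every
// record carries the same fields and downstream consumers know what to expect.
function makeKnowledgeObject(fields) {
  return {
    title:     fields.title     || '',
    content:   fields.content   || '',
    url:       fields.url,                 // the shared link itself
    thumbnail: fields.thumbnail || '',
    tags:      fields.tags      || [],     // filled in later by tagging
    channel:   fields.channel,             // where it was shared
    sender:    fields.sender               // who shared it
  };
}
```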
Modelling the information around us as knowledge-objects has proven to be a very extensible and reliable thing for me. But that’s another post for another time.
Long story short, Regina now has a REST-based API that lets her save links each time they are posted on Slack.
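The save call itself is a plain HTTP POST. The sketch below follows Stamplay's custom-object REST convention as I understand it; the app ID and object ID here ("regina", "knowledge_object") are placeholders, so check the actual Stamplay docs for your app's endpoint:

```javascript
// Build the request that saves a knowledge-object to Stamplay.
// App ID and object ID are illustrative placeholders.
function buildSaveRequest(appId, objectId, knowledgeObject) {
  return {
    method: 'POST',
    url: 'https://' + appId + '.stamplayapp.com/api/cobject/v1/' + objectId,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(knowledgeObject)
  };
}
```

Any HTTP client (Node's `https` module, `request`, and so on) can then fire the request the function returns.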
Time spent: 15 minutes
Tagging at the speed of thought
Having Regina parse the content behind a link and automatically generate tags for it was an important feature to have. How do you do this without any human involvement? You use the AlchemyAPI. Alchemy is an excellent API for quick AI analysis of your content: finding out its subject, author, tags, and more. It has a nifty Node.js code sample that you can rapidly repurpose into an API.
I created an API endpoint that received URLs and returned a JSON structure with tags and other metadata.
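Alchemy's keyword extraction returns entries with a text and a relevance score. The endpoint's job boils down to flattening that into a clean tag list; the response shape below matches Alchemy's keyword output as I recall it, and the 0.5 relevance cutoff is my choice, not anything the API mandates:

```javascript
// Turn AlchemyAPI keyword entries like { text: 'Chat Bots', relevance: '0.91' }
// into a flat, lowercase list of tags above a relevance threshold.
function keywordsToTags(keywords, minRelevance) {
  minRelevance = minRelevance === undefined ? 0.5 : minRelevance;
  return keywords
    .filter(function (k) { return parseFloat(k.relevance) >= minRelevance; })
    .map(function (k) { return k.text.toLowerCase(); });
}
```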
Time spent: 2 hours
Stamplay is awesome because it allows you to build a complex application simply by wiring individual pieces together and setting up tasks based on triggers.
In any normal development environment, this would take hours of code-test-build-code cycles. Here, I simply did the following:
- I wrote a small code block that would receive an ID and a URL, and hit the Alchemy API endpoint I mentioned above. Once it got the response, it would post an update to the database with the tags and other metadata.
This kept everything decoupled and independent: even if my Alchemy API went down for some reason, the bot could still keep saving links into the Stamplay backend.
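That decoupling can be sketched as a best-effort tagging step: by the time it runs, the link is already in the database, so a tagging failure should never lose data. This is an illustrative sketch, not the actual trigger code, and the real calls would be asynchronous; `fetchTags` and `updateObject` are injected placeholders for the Alchemy call and the database update:

```javascript
// Best-effort tagging: the link is already saved, so if the tag
// service fails we simply leave the record untagged.
function tagSavedLink(id, url, fetchTags, updateObject) {
  try {
    const tags = fetchTags(url);          // e.g. the Alchemy endpoint
    updateObject(id, { tags: tags });     // e.g. a Stamplay PATCH/PUT
    return { id: id, tagged: true };
  } catch (err) {
    // Alchemy is down or slow; the saved link is untouched.
    return { id: id, tagged: false };
  }
}
```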
Time spent: 1 hour
The age of smart bots
To make my bot look smart, I did one final hack on Stamplay. I set up a reverse trigger that would post the tags from Alchemy back into the Slack channel. It blew the team away when Regina replied like a real human, describing what the link they had just posted was about.
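The reverse trigger only needs to turn the stored tags into a human-sounding message; the exact phrasing below is made up, not Regina's real wording:

```javascript
// Format the reply the reverse trigger posts back into the channel.
function formatTagReply(tags) {
  if (!tags.length) {
    return "I saved that link, but I couldn't tell what it's about.";
  }
  return 'Looks like that link is about: ' + tags.join(', ');
}
```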
Time spent: 15 minutes
Finally, what good is all this data if it isn’t exposed in a nice UI on the frontend? The idea is for people to be able to search FactorDaily Curated by which FactorDaily writer posted a story, by tags or topics, or to see a timeline of what was posted when.
Stamplay exposes all objects through a very nice REST-based API that can be queried by parameters. It has built-in pagination, and the responses are JSON.
This makes writing a simple frontend in AngularJS or any other technology a piece of cake. All one needs to do is consume this API on the frontend and render it as HTML.
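From the frontend's point of view, a search is just a query string on that REST API. The parameter names below ("sender", "page", "per_page") and the placeholder app and object IDs are illustrative; Stamplay's actual filter syntax lives in its API docs:

```javascript
// Build the query URL an AngularJS (or any other) frontend would hit.
// App ID, object ID, and parameter names are illustrative placeholders.
function buildSearchUrl(appId, objectId, filters, page, perPage) {
  const params = Object.keys(filters).map(function (k) {
    return encodeURIComponent(k) + '=' + encodeURIComponent(filters[k]);
  });
  params.push('page=' + page, 'per_page=' + perPage);
  return 'https://' + appId + '.stamplayapp.com/api/cobject/v1/' +
    objectId + '?' + params.join('&');
}
```

The frontend consumes whatever this URL returns and renders it as HTML.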
Thus, you can have a fairly complex, end-to-end solution by wiring together existing technologies with a service like Stamplay at the backend. It allows you to get from idea to MVP in a few hours (literally) with very little investment in code.
You can run this in a production environment, observe user behaviour, fine tune your product, raise money — literally anything you want.
The age of bots and backend as a service has truly arrived, and at FactorDaily, we’re just getting started.
We’ll keep exploring new ways of using bots to make life simpler and more fun for both the FactorDaily newsroom and our readers. If you’re a nerd and can actually code, check out the code and improve or modify it here and here. The code is in the public domain and provided on an as-is basis.