Mental Models API
Here we go, the first blog entry. Long in gestation, this post describes how I went about building an API for a topic I am particularly fascinated by.
I came across the Farnam Street blog quite accidentally while mindlessly browsing Twitter for a stray nugget of wisdom (believe it!). The founder, Shane Parrish, had tweeted a quote by Charlie Munger, and the link directed to the Farnam Street website. The website has articles on a variety of self-help topics, but at the centre of it all is the concept of Mental Models: how they can be used effectively to overcome challenges in life and, more generally, to apply logic and look at a situation from a fresh perspective. There are a total of 113 different Mental Models listed on the website, spread across 8 categories. For building my first API, this seemed like a great springboard to jump from.
Basic Outline
The first step in building an API for Mental Models was gathering all the data, and the handy tool for the job was good old web scraping. I wrote a script in Python to run against the main Mental Models page (Mental Models). After a few trials and errors, I managed to run the scraper successfully, generating a file with all 113 Mental Models listed out under their respective 8 categories. `insert cmd line img of scraper`
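The original scraper was written in Python and I don't have its source, but the core extraction logic can be sketched in JavaScript. This is a minimal, self-contained illustration that assumes the page lists each category as an `<h2>` heading followed by model names in `<li>` items; the sample HTML below is a made-up stand-in for the real page, not its actual markup.

```javascript
// Stand-in for the fetched page; the real site's markup will differ.
const sampleHtml = `
<h2>General Thinking Concepts</h2>
<ul><li>The Map is Not the Territory</li><li>Circle of Competence</li></ul>
<h2>Numeracy</h2>
<ul><li>Compounding</li><li>Regression to the Mean</li></ul>
`;

function scrapeModels(html) {
  const result = {};
  let currentCategory = null;
  // Walk the markup tag by tag, tracking the most recent category heading.
  const tagRe = /<(h2|li)>(.*?)<\/\1>/g;
  let match;
  while ((match = tagRe.exec(html)) !== null) {
    const [, tag, text] = match;
    if (tag === 'h2') {
      currentCategory = text;
      result[currentCategory] = [];
    } else if (currentCategory) {
      result[currentCategory].push(text);
    }
  }
  return result;
}

const models = scrapeModels(sampleHtml);
console.log(models);
```

A real scraper would fetch the live page and use a proper HTML parser rather than a regex, but the grouping idea (heading sets the category, list items fall under it) is the same.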
The next task was to segregate the mental models by category into separate JSON files, so that each category could be easily inserted as a collection into a NoSQL database in the next step of the process.
MongoDB and mLab
The data collected in the previous step needed to be stored in a MongoDB database instance running in the cloud. Snooping around a bit, I came across a wonderful utility called mLab, which offered a free account with a 500 MB storage limit: perfect for a first-time setup and more than enough to host my data. I migrated my JSON files to the MongoDB server as collections, one per category, then opened a terminal on my machine and tested the MongoDB database running on mLab. Good, the database is up and running!
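For the migration itself, MongoDB's `mongoimport` tool can load a JSON array straight into a collection. A hypothetical invocation for one category might look like this; the host, port, database name and credentials below are placeholders, not my actual mLab details.

```shell
# Import one category file as a collection; repeat per category file.
mongoimport --host ds012345.mlab.com --port 12345 \
  --db mental-models --collection numeracy \
  --username <user> --password <pass> \
  --file numeracy.json --jsonArray
```

The `--jsonArray` flag tells the tool the file holds a single JSON array of documents rather than one document per line.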
Node.js
To build this API, the next essential step was a server to feed data to the client whenever an API endpoint is called. Node.js seemed the obvious choice, since I had worked with the technology before, and it is well documented with a lot of active developers choosing to build projects with it.
The npm (Node package manager) repository has a wide range of useful packages to help you get started with your projects. One of them is the MongoDB driver package Mongoose. It provides an interface between your server and the MongoDB database instance, making it easy to move data from the database to the client. With this connection in place, the next task was to add basic CRUD (create, read, update and delete) functionality to the project. The Hapi package provided an excellent framework for this, assisting in wiring up the GET, POST, PUT and DELETE actions of the API. A similar and more popular alternative to Hapi is the Express package, but I chose Hapi just to try out something new, and it did not disappoint at all.
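Putting Hapi and Mongoose together might look roughly like the sketch below. This is not the project's actual source: the schema, route paths and environment variable names are my assumptions, and it requires the `@hapi/hapi` and `mongoose` packages to be installed.

```javascript
'use strict';
const Hapi = require('@hapi/hapi');
const Mongoose = require('mongoose');

// My guess at the shape of a stored document: a model name plus its category.
const MentalModel = Mongoose.model('MentalModel', new Mongoose.Schema({
  name: String,
  category: String,
}));

const init = async () => {
  // MONGODB_URI is a placeholder for the mLab connection string.
  await Mongoose.connect(process.env.MONGODB_URI);
  const server = Hapi.server({ port: process.env.PORT || 3000 });

  // Read: list every mental model in a given category.
  server.route({
    method: 'GET',
    path: '/models/{category}',
    handler: (request) => MentalModel.find({ category: request.params.category }),
  });

  // Create: add a new mental model document.
  server.route({
    method: 'POST',
    path: '/models',
    handler: (request) => MentalModel.create(request.payload),
  });

  await server.start();
  console.log(`Server running at ${server.info.uri}`);
};

init();
```

In modern Hapi, a handler can simply return a value or a promise (here, the Mongoose query) and the framework serializes it as the JSON response, which keeps the route definitions pleasantly short.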
The Node.js server needed to run continuously for the API to be useful, and Heroku was the option I chose to get this accomplished. The setup is straightforward if you are familiar with git commands; Heroku basically runs a Node.js instance in the cloud. I pushed the server file along with the package.json file so the dependencies would be installed on the Heroku instance. On setup, Heroku provided me with a beautifully esoteric name for the instance running on its server. There, the API endpoint for Mental Models is finally up and running! The eight different categories are indexed to provide an efficient way to call the desired API endpoint and subsequently get the data for the selected category.
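The deployment flow can be sketched as a handful of commands, assuming the Heroku CLI is installed and the server's entry point is a file I'll call `server.js`; the config variable name is a placeholder of my own.

```shell
# Tell Heroku how to start the app (a one-line Procfile at the repo root).
echo "web: node server.js" > Procfile

heroku create                                      # provisions the app with an auto-generated name
heroku config:set MONGODB_URI=<mlab-connection-string>
git push heroku master                             # deploys; Heroku runs npm install from package.json
```

After the push, Heroku builds the app, installs the dependencies declared in package.json, and starts the process described in the Procfile.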
![Return value of an API call]()
Conclusion
To wrap this article up, it was a great experience building this project, joining the dots from things I had learned previously and getting it all running on a server. I hope to make use of this API in an upcoming project, building on the already rich trove of knowledge present on the Farnam Street blog.

