After participating in my first Write-a-thon, I proceeded with development as usual. One day, Hashnode announced another Hackathon, this time partnering with Linode, a cloud hosting company that provides virtual private servers.
The challenge was to create an open source project using Linode in some way or another, and attribute both Hashnode and Linode in the project.
At first, I wanted to dismiss the challenge as beyond my scope. Learning new technologies always throws you back into feeling like a beginner when things don't work, especially when you get by just fine with the tools you're already accustomed to. Even when I checked out their website, I was overwhelmed by everything they provide: higher education, DDoS protection, SaaS... what did those all even mean?
I spent a day reading articles and watching YouTube videos on Linode, until I realized one thing I could use Linode for: deploying a NodeJS application. As a Windows developer, I mostly used services to deploy my apps (such as Heroku and CapRover). However, I recently gained experience with using Linux machines (Ubuntu 20.04) to manually deploy apps during my development at my summer internship.
Thus, all I needed to do was create an application locally, get a Linode virtual private server (VPS), and deploy the app there.
In this particular case, I definitely planned to refer to the YouTube videos for assistance when deploying.
The only thing left to do was get an idea of an open source software to work on.
Getting an Idea
Idea generation involves many steps, from market research, to brainstorming, to finding problems that you can create a solution for. I had been stuck on this for a while, until I found a problem during my internship that, at the time, I could not seem to solve.
Having a Problem
Have you ever needed to integrate software with a server where you have no access to the server logs and no control over what an external API sends to the server?
In my case, the external service would send a request to the server with data when triggered by an external action. Unfortunately, I had no easy way of determining the values being sent in the request, because their documentation was vague. Additionally, I may have been able to replicate the server locally, but I had no clue how to broadcast my IP for the service to send the request there instead (nor did I want to do something potentially unsafe, security-wise).
Finding a Solution
Thus, I wondered if there was a service one could use online to generate an endpoint for them, and they could use that in the external service. That way, when the requests are triggered, they can see what request was made and what data was sent. Unfortunately, I could not find any existing software to do it for me at the time, so I proceeded to make my own!
I was unable to find existing software only because I did not know what to search for at the time. During the research and development of 10 Minute Endpoint, I realized I should have searched for terms like "webhooks", which would have led me to Webhook.site, an existing service that already does what I needed (and much more). However, I still decided to work on 10 Minute Endpoint for the learning experience.
About the Software
Much of the development behind 10 Minute Endpoint was influenced by design restrictions and time limits. I used this project to learn new tools, and I knew I would not be able to add all planned features within the month, so I minimised the goals of the project to remain within scope, and added additional features afterward.
What is 10 Minute Endpoint
10 Minute Endpoint is an open-source service that generates a temporary endpoint for a developer to test and inspect HTTP requests sent to it. They can practice making HTTP (or fetch) requests by sending them to this URL, or use the URL in a third-party service to easily see what data it sends to their endpoint, which can be useful when developing APIs that expect certain fields of data.
How it got its name
10 Minute Endpoint (10ME) was originally called Proxyen because of my lack of knowledge of the terminology. I knew of proxies, which forward requests, and endpoints, which receive requests, and mixed those words together.
As development progressed, I realized that "proxy" was the wrong word to use. Additionally, to prevent overuse of a database (that I did not know how to scale), I decided to follow existing services such as 10 Minute Mail and only provide each endpoint for 10 minutes. After 10 minutes, the endpoint is deleted, freeing space for other users.
What's its Stack
10ME was developed with the PLEB stack:
- Database/storage with Prisma (MongoDB)
- DevOps/deployed on Linode
- Backend/API with Express (NodeJS)
- Frontend/design using BulmaCSS
How 10 Minute Endpoint works
How it stores data
10 Minute Endpoint uses Prisma and MongoDB as its database.
Why I chose Prisma
I already had experience with applications built on MongoDB and PostgreSQL, but every time, the process of setting up the database, tables, schemas, etc. was tedious.
Deploying with MongoDB was easier because I could use MongoDB Atlas, an online cloud database, whereas with PostgreSQL I had to set up the database on the production server. Previously, I used Heroku for my PostgreSQL databases, so it created and managed them for me. But with a VPS instead of a managed service, I would need to set up the database myself, which would take more time than I wanted to spend on this software.
Additionally, I was not sure what information I would want to store initially, meaning that if I had to add more data, I would need to migrate and update table schemas, resulting in more time fighting the database than working on the application. Thus, I chose the easier option that also allows storing dynamic data without migrations: MongoDB.
However, I had only ever used Mongoose for my schemas, and realized I would have to relearn it for this application's storage needs. That's when I found Prisma!
Prisma supports many databases, such as PostgreSQL and MongoDB, and manages database migrations and schema changes for you. It also provides a very intuitive API for defining schemas, querying data, creating records, and more. After another day of research and testing, MongoDB still seemed like the simpler database to use, so I chose MongoDB with Prisma.
There were some issues using MongoDB locally, since Prisma required a replica set and I had a standalone instance, so I used a MongoDB Atlas database instead and could continue to focus my remaining time on developing the application.
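To give a sense of what storing endpoints and logs with Prisma and MongoDB looks like, here is a hypothetical schema sketch. The model and field names (Endpoint, Log, code, expiresAt, and so on) are my guesses for illustration, not the project's actual schema:

```prisma
// Hypothetical Prisma schema for storing endpoints and their logged requests.
// Model and field names are illustrative, not taken from the real codebase.

datasource db {
  provider = "mongodb"
  url      = env("DATABASE_URL")
}

generator client {
  provider = "prisma-client-js"
}

model Endpoint {
  id        String   @id @default(auto()) @map("_id") @db.ObjectId
  code      String   @unique
  expiresAt DateTime
  logs      Log[]
}

model Log {
  id         String   @id @default(auto()) @map("_id") @db.ObjectId
  method     String
  body       Json
  query      Json
  receivedAt DateTime @default(now())
  endpoint   Endpoint @relation(fields: [endpointId], references: [id])
  endpointId String   @db.ObjectId
}
```

Because MongoDB stores documents rather than rigid tables, the Json fields can hold whatever body and query shapes arrive without a migration, which matches the "dynamic data" motivation above.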
How it generates endpoints
A major design decision for 10ME was whether user accounts were necessary or not. Maybe with user accounts, endpoints could last longer than 10 minutes.
Scrapping the idea of user accounts
However, a user system would require authentication, registration, validation, and more, which usually take a significant amount of time to build.
My previous experience was with session-based login (using passport-local), and existing authentication flows (such as OAuth2) might have been faster to set up, but I did not want to add more data to be stored in the database.
Additionally, signing up acts as a barrier to using a service; I just wanted users to visit the page, get their logs, and leave the page when they're done.
Tracking anonymous users
However, I also needed to keep track of anonymous users so that 10ME would not regenerate the endpoint every time the user refreshes the page. For that, I used express-session to track the sessions of users who visit the service.
When users visit for the first time, a user code is created and assigned to them, and stays with them until it expires. When it expires, it is deleted from their session, so when they visit the page again, a new code is generated and assigned to them.
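The assign-until-expiry flow above can be sketched in plain JavaScript. This is a minimal illustration, not the real code: the function and constant names (ensureCode, TEN_MINUTES) are mine, and the session object stands in for req.session from express-session.

```javascript
// Sketch of the anonymous-session flow: assign a code on first visit,
// keep it until it expires, then generate a fresh one.
// Names are illustrative, not from the real codebase.

const TEN_MINUTES = 10 * 60 * 1000;

// Generate a short random endpoint code.
function generateCode() {
  return Math.random().toString(36).slice(2, 10);
}

// Given a session object (e.g. req.session), return its endpoint code,
// creating a fresh one if it is missing or expired.
function ensureCode(session, now = Date.now()) {
  if (!session.code || now >= session.expiresAt) {
    session.code = generateCode();
    session.expiresAt = now + TEN_MINUTES;
  }
  return session.code;
}

const session = {};
const first = ensureCode(session, 0);              // new code assigned
const second = ensureCode(session, 5 * 60 * 1000); // not expired: same code
const third = ensureCode(session, 11 * 60 * 1000); // expired: new code
console.log(first === second, first === third); // true false
```

Refreshing the page within the 10 minutes hits the first branch's false case and keeps the same code, which is exactly the behaviour described above.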
How it deletes endpoints
Every time the server is rebooted, all previous sessions are reset, so even if a user's 10 minutes are not complete, a new session would be created for them if they refresh the page.
This could possibly be prevented by using cookie sessions, but I have yet to explore that option.
That means it is possible an endpoint can expire, but not be deleted right away. Rather than set a timer to delete each endpoint as they are created, the server has one global timer that deletes all expired endpoints every 10 minutes.
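The single global sweep described above could look something like this. It is a sketch under my own assumptions: an in-memory Map stands in for the real database, and the names are illustrative.

```javascript
// Sketch of one global cleanup timer instead of one timer per endpoint.
// The Map stands in for the real database; names are illustrative.

const endpoints = new Map(); // code -> { expiresAt, logs: [...] }

// Delete every endpoint whose expiry time has passed.
function sweepExpired(store, now = Date.now()) {
  for (const [code, endpoint] of store) {
    if (now >= endpoint.expiresAt) store.delete(code);
  }
}

// Run the sweep every 10 minutes (commented out so the sketch exits cleanly):
// setInterval(() => sweepExpired(endpoints), 10 * 60 * 1000);

endpoints.set('abc', { expiresAt: 100, logs: [] });
endpoints.set('xyz', { expiresAt: 900, logs: [] });
sweepExpired(endpoints, 500); // 'abc' has expired, 'xyz' has not
console.log([...endpoints.keys()]); // [ 'xyz' ]
```

One timer means an endpoint may linger for up to 10 minutes past its expiry, the trade-off noted above, but the server only ever pays for a single scheduled job.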
How it logs requests
Similar to how 10 Minute Mail provides a temporary email address, 10ME uses the code generated for the user as their endpoint, accepting requests at the "https://10minuteendpoint.net/endpoint/<code>" URL. At the "/endpoint" route, received requests are stored with the associated endpoint code, along with each request's method, body, and query data.
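The storage step of that route can be sketched as a plain function, with the Express wiring shown in a comment. This is my illustration of the idea, not the project's actual handler; logRequest and the store shape are assumed names.

```javascript
// Sketch of logging an incoming request against its endpoint code.
// store: Map of code -> array of logged requests. Names are illustrative.
function logRequest(store, code, req) {
  const logs = store.get(code);
  if (!logs) return false; // unknown or expired endpoint
  logs.push({
    method: req.method,
    body: req.body,
    query: req.query,
    receivedAt: Date.now(),
  });
  return true;
}

// In Express this would be wired up roughly as:
// app.all('/endpoint/:code', (req, res) => {
//   const ok = logRequest(store, req.params.code, req);
//   res.sendStatus(ok ? 200 : 404);
// });

const store = new Map([['abc', []]]);
logRequest(store, 'abc', { method: 'POST', body: { hello: 'world' }, query: {} });
console.log(store.get('abc').length); // 1
```

Using app.all (rather than app.get or app.post) matters here, since the whole point is to capture whatever method the third-party service decides to send.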
How endpoint logs are displayed
Ideally, when a request is logged, I wanted it to instantly be displayed on the developer dashboard.
One way to do that would be to keep a persistent WebSocket connection between the server and the developer's browser (client) so data can be sent to the client right away, or to push server-sent events (SSE) from the server to the client when the data changes.
However, setting up WebSockets or SSE turned out to be time-consuming, especially when considering deployment and having to configure ports and nginx. Thus, the client browser simply polls the server API for new logs. A fetch request to the "https://10minuteendpoint.net/api/logs/<code>" URL is sent every second, and all associated logs are returned to the client.
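The polling loop can be sketched as below. The URL shape matches the article, but fetchLogs, renderLogs, and window.endpointCode are names I made up for illustration; the real client code may differ.

```javascript
// Sketch of client-side polling for new logs once per second.
// fetchLogs/renderLogs/endpointCode are illustrative names.

function logsUrl(code) {
  return `https://10minuteendpoint.net/api/logs/${code}`;
}

async function fetchLogs(code) {
  const res = await fetch(logsUrl(code));
  return res.json(); // all logs associated with the endpoint code
}

// In the browser, poll every second and re-render the dashboard.
if (typeof window !== 'undefined') {
  setInterval(async () => {
    const logs = await fetchLogs(window.endpointCode);
    renderLogs(logs); // hypothetical DOM-rendering helper
  }, 1000);
}

console.log(logsUrl('abc')); // https://10minuteendpoint.net/api/logs/abc
```

Polling trades immediacy for simplicity: no long-lived connections to configure through nginx, at the cost of one request per second per open dashboard.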
Disadvantage of polling server
After each poll, the server sends all existing logs to the client, which then creates the associated HTML to display them to the developer. Sending all the log data on every request can become resource intensive for the server, potentially causing performance issues.
Quality of Life Optimizations
There are many things I was unable to add due to time, which I would add if I'm able to continue working on 10ME:
Lowering Potential Performance Issues
To minimise the potential performance issues, I can limit each endpoint to a maximum of 25 stored logs at once, with new logs replacing the oldest in a first-in, first-out (FIFO) fashion.
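The FIFO cap is small enough to sketch directly. This is an illustrative version (MAX_LOGS and pushLog are my names), not code from the project:

```javascript
// Sketch of capping each endpoint at 25 logs, dropping the oldest
// first-in-first-out when a new one arrives. Names are illustrative.

const MAX_LOGS = 25;

function pushLog(logs, entry) {
  logs.push(entry);
  if (logs.length > MAX_LOGS) logs.shift(); // drop the oldest log
  return logs;
}

const logs = [];
for (let i = 0; i < 30; i++) pushLog(logs, { seq: i });
console.log(logs.length, logs[0].seq); // 25 5
```

With the cap in place, the payload returned by each poll is bounded, regardless of how many requests a third-party service fires at the endpoint.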
Spam Protection (DOS Prevention)
It is possible for malicious users to send requests to their endpoints in a loop (or even try logging to random endpoints). To mitigate DoS (Denial of Service) attacks, I can add rate limiting on all routes in NodeJS (for example, with middleware such as express-rate-limit).
Fortunately, Linode automatically detects and mitigates such attacks from happening.
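To show what application-level rate limiting boils down to, here is a minimal fixed-window limiter sketch. It is illustrative only; in practice an off-the-shelf middleware like express-rate-limit would handle this, along with Linode's network-level protection.

```javascript
// Minimal fixed-window rate limiter sketch (illustrative, not the real
// implementation). Allows at most maxRequests per key per window.

function createLimiter(maxRequests, windowMs) {
  const hits = new Map(); // key (e.g. an IP address) -> { count, windowStart }
  return function allow(key, now = Date.now()) {
    const entry = hits.get(key);
    if (!entry || now - entry.windowStart >= windowMs) {
      hits.set(key, { count: 1, windowStart: now }); // start a new window
      return true;
    }
    entry.count += 1;
    return entry.count <= maxRequests;
  };
}

const allow = createLimiter(3, 1000); // 3 requests per second per key
console.log(allow('ip', 0), allow('ip', 10), allow('ip', 20), allow('ip', 30));
// true true true false
```

As Express middleware, allow(req.ip) would run before the "/endpoint" route and return a 429 status when it yields false.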
Extending Endpoint Lifetimes
10 Minute Mail has the option to extend an email address by 10 minutes, or to renew expired addresses. To add these features to 10ME, I can create an API that increases the expiry time of an endpoint, though that would only work if the endpoint has not already been deleted by the server's 10-minute cleanup loop.
Currently, once the 10 minutes have passed, the user no longer has access to their logs or the endpoint. Seeing as this is just for developer testing, I don't consider it data they can meaningfully lose.
Conclusion
In terms of service usage, 10ME successfully solves the original problem I had of testing webhooks and endpoints (although similar services already exist with more features).
In terms of open source software, 10ME is functional but somewhat hard-coded, with just enough code to achieve the simple scope of the project. I did follow as many best practices as I am aware of, with plans to make the software more flexible in the future.
I also added a readme so other developers can deploy their own instances of 10 Minute Endpoint, and for now it is up to them to extend the functionality for their specific needs.
At the end of development, I learnt of many tools (closed source or bundled with other features) that provide solutions similar to 10ME, such as Webhook.site, Pipedream, Request Catcher, Endpoints, and PTSV2. However, there is still room for the likes of 10 Minute Endpoint.
10 Minute Endpoint logs requests as minimally as possible, does not store user data, deletes endpoint data as quickly as possible, and aims to be used as easily as possible, prioritising simplicity and smooth intuitive usage flow.
Despite the spaghetti code resulting from the limited scope and time available, I learnt many new things developing this project, such as database management with Prisma, webhooks, deploying to a custom virtual private server, and styling applications with the Bulma framework.