AIFlash

Serverless app utilizing generative AI to transform raw study notes into interactive flashcards. Built with an Angular frontend and a scalable AWS backend, it features 3D card animations and real-time AI processing via OpenRouter.

Visit Live Demo

Tech Stack

Development Blog Posts

Continuity

March 18, 2026 - Part 1

I really enjoyed building the Crawler and Universal Room Booker (URB) projects recently; the thrill of building projects from scratch in a day was an amazing experience. So much so that I wanted to do it again, but this time with a different focus. I've never used AWS, and the idea of building something without worrying about servers or infrastructure really appeals to me.

To give more context on my growing curiosity about serverless architecture: I recently had a technical interview where the interviewer (a really nice guy) told me how his software engineering team works. He described how he and his team transitioned core backend systems to an event-driven, serverless model on AWS because it reduced operational overhead and let them focus purely on business logic. That stuck with me, because up until that point most of my projects had focused on features. Hearing someone frame architecture as a strategic advantage, something that directly impacts scalability, maintainability, and team velocity, made me realize there was a deeper layer of engineering I hadn't fully explored yet.

— Montasir

Architecture

March 18, 2026 - Part 2

I've decided on the idea of an AI-powered flashcard generator for students. The core concept is simple: users input their raw study notes, and the app uses generative AI to transform those notes into interactive flashcards. The stack will simply be Angular and, of course, AWS.

That said, I see this project as a way to get comfortable with AWS and explore its console and services, rather than to build a super polished product. Again, I've never used AWS, so services like Lambda, API Gateway, and S3 are all new to me. I want to make sure I understand how they work and how to use them effectively.

— Montasir

Implementation

March 18, 2026 - Part 3

The implementation started with the "brain" of the app: an AWS Lambda function. I chose Node.js for the runtime since it's lightweight and works well with the async nature of AI API calls. Getting the OpenAI library to actually run in the AWS environment was the first real issue. I had to learn about Lambda Layers to package my node_modules properly so the function could import the dependencies it needed to talk to OpenRouter.

Next, I had to bridge the gap between my Angular frontend and the backend. I set up an HTTP API via AWS API Gateway. This acted as the front door, taking the notes from my UI and triggering the Lambda. A huge chunk of time went into configuring CORS. It's one of those things that works fine on localhost but becomes a brick wall in the cloud if you forget to explicitly tell AWS to trust your specific frontend origin.
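With an HTTP API you can configure CORS on the gateway itself, but it can also be handled in the function by attaching the headers to every response. For reference, here's a hedged sketch of the function-side version; the origin value is a placeholder, not my actual Amplify URL:

```javascript
// Sketch of function-side CORS headers (HTTP APIs can alternatively handle
// this at the gateway level). The origin is a placeholder: it must match the
// deployed frontend exactly, scheme included, with no trailing slash.
const ALLOWED_ORIGIN = "https://main.d123example.amplifyapp.com";

function withCors(response) {
  return {
    ...response,
    headers: {
      ...(response.headers || {}),
      "Access-Control-Allow-Origin": ALLOWED_ORIGIN,
      "Access-Control-Allow-Headers": "Content-Type",
      "Access-Control-Allow-Methods": "OPTIONS,POST",
    },
  };
}
```

Wrapping every return value in `withCors` keeps the headers consistent, which is exactly the part that silently breaks when only the happy path gets them.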

I initially used S3 for static website hosting, but eventually switched to AWS Amplify because it handled the SSL certificates and deployment pipeline much more smoothly.

Seeing the data flow from a textarea in Angular, through API Gateway, to an LLM inside a Lambda was pretty satisfying. It felt less like writing code and more like assembling a high-performance machine.

— Montasir

Testing

March 18, 2026 - Part 4

Testing was a lesson in cloud latency. Initially, I was getting "Unknown Errors" in the console. After digging through CloudWatch logs, I realized my Lambda was timing out: the default 3-second limit wasn't enough for an AI model to think and respond. I extended the timeout to 30 seconds and increased the memory allocation, which on Lambda also provisions proportionally more CPU.

I also ran into issues with the AI response format. LLMs like to talk, but my frontend needed strict JSON. I had to refine my system prompt to ensure the AI returned only the raw array. I tested this by feeding it messy, unstructured lecture notes, like my OS notes on logical address spaces, and verified the cards came back correctly categorized with consistent IDs.
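The validation side of that looks roughly like the sketch below. The field names and the fence-stripping are my illustration of the approach, not the project's exact code; the one behavior worth copying is that parsing should fail loudly when the model ignores the prompt:

```javascript
// Sketch of validating the model's reply, assuming the prompt asks for a raw
// JSON array of { question, answer, category } objects (field names are
// illustrative). Some models still wrap JSON in markdown fences, so strip
// them before parsing.
function parseFlashcards(raw) {
  const cleaned = raw
    .trim()
    .replace(/^```(?:json)?\s*/i, "")
    .replace(/```\s*$/, "");
  const cards = JSON.parse(cleaned); // throws if the model ignored the prompt
  if (!Array.isArray(cards)) {
    throw new Error("expected a JSON array of flashcards");
  }
  // Assign sequential ids so the frontend gets consistent keys.
  return cards.map((card, i) => ({ ...card, id: i + 1 }));
}
```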

Once the backend was stable, I tested the end-to-end flow from the live Amplify URL. Seeing the 3D flip animations work with real data pulled from the cloud confirmed everything was finally solid.

— Montasir

Reflection

March 18, 2026 - Part 5

Looking back, the biggest takeaway wasn't just learning AWS; it was learning to think in terms of managed services. In my previous projects, I was responsible for everything. Here, I learned to delegate: I let S3 handle the files and Lambda handle the logic. It's a shift in mindset from building a server to building a system.

I also think that "Full Stack" in 2026 could really mean "Full-Stack Cloud". Understanding how to configure a VPC, manage IAM roles, and debug a distributed trace in CloudWatch is just as important as knowing how to center a div in CSS. The complexity has moved from the code itself to the connections between services.

AIFlash started as a way to get comfortable with the AWS console, but it turned into a blueprint for how I want to build software going forward: fast, decoupled, and infinitely scalable.


Time to study..

— Montasir

Montasir Moyen - Software Engineer | Full-Stack Developer - Boston