Optimizing AWS Batch Jobs for Efficient Video Ingestion | glowing-telegram project - Episode 131
In this video, I delve into how we're optimizing our video ingestion process using AWS Batch jobs in our 'Glowing Telegram' Rust/TypeScript web application. After troubleshooting some AWS permissions issues and porting Pulumi code from our test projects into Python, I've automated video uploads to S3 so that each upload triggers an AWS Batch job for metadata extraction.
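The trigger described above can be sketched roughly as follows: a small handler receives the S3 ObjectCreated event and turns it into an AWS Batch job submission. This is an illustrative sketch, not the project's actual code; the queue name `metadata-extraction-queue`, the job definition `extract-metadata`, and the helper `build_batch_job_request` are all hypothetical.

```python
import json
import urllib.parse


def build_batch_job_request(s3_event: dict) -> dict:
    """Translate an S3 ObjectCreated event into AWS Batch submit_job kwargs."""
    record = s3_event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    # Object keys arrive URL-encoded in S3 event notifications.
    key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
    return {
        "jobName": f"extract-metadata-{key.replace('/', '-')}",
        "jobQueue": "metadata-extraction-queue",  # hypothetical queue name
        "jobDefinition": "extract-metadata",      # hypothetical job definition
        "containerOverrides": {
            "environment": [
                {"name": "INPUT_BUCKET", "value": bucket},
                {"name": "INPUT_KEY", "value": key},
            ]
        },
    }


def handler(event, context):
    # In a real Lambda, a boto3 Batch client would submit the request:
    # boto3.client("batch").submit_job(**build_batch_job_request(event))
    print(json.dumps(build_batch_job_request(event)))
```

The batch job itself then reads `INPUT_BUCKET`/`INPUT_KEY` from its environment and pulls the video from S3 for metadata extraction.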
The focus is on turning manual processes into robust, serverless workflows on AWS. We discuss splitting processing into separate scopes within Rust so that input names can be reused and video files handled efficiently. Additionally, I'm working on introducing Step Functions to transcribe large video files in chunks, since maintaining context across chunk boundaries matters for accurate transcription.
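One common way to keep context across transcription chunks is overlapping time windows: each segment shares a few seconds with its neighbor, so speech that straddles a boundary still appears whole in at least one chunk. A minimal sketch of that idea, with illustrative (not the project's actual) 300s/10s defaults:

```python
def plan_segments(duration_s: float, chunk_s: float = 300.0, overlap_s: float = 10.0):
    """Split a video's duration into overlapping (start, end) windows.

    Each window overlaps the previous one by `overlap_s` seconds so that
    speech crossing a chunk boundary is fully contained in some chunk.
    """
    if duration_s <= chunk_s:
        return [(0.0, duration_s)]
    segments = []
    start = 0.0
    step = chunk_s - overlap_s
    while start < duration_s:
        end = min(start + chunk_s, duration_s)
        segments.append((start, end))
        if end >= duration_s:
            break
        start += step
    return segments
```

A Step Functions Map state could then fan these windows out to parallel transcription tasks and merge the results, deduplicating text in the overlapped regions.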
I also touch on how the project's TypeScript front end interacts with the backend processes, leveraging AWS's capabilities to run our batch jobs effectively. Along the way, we confront challenges like managing video processing efficiently, in both storage and speed, while keeping the system flexible enough to scale as needed.
🔗 Check out my Twitch channel for more streams: https://www.twitch.tv/saebyn
GitHub: https://github.com/saebyn/glowing-telegram
Discord: https://discord.gg/N7xfy7PyHs