Crafted with Passion: A Tour of My Technological Triumphs

Architected 5-6 independent microservices using .NET Core with PostgreSQL, MongoDB, RabbitMQ, and Kafka for async messaging and event-driven communication. Implemented production-grade auth (OAuth 2.0, IdentityServer) and real-time updates via SignalR; deployed on Kubernetes with API gateway and reverse proxy (YARP), and a Next.js frontend.
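The event-driven flow between these services can be sketched with a tiny in-memory publish/subscribe bus (the real system routes events through RabbitMQ and Kafka; the topic and field names here are hypothetical):

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Tiny in-memory stand-in for a RabbitMQ/Kafka topic."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Fan the event out to every service listening on this topic.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
received = []
bus.subscribe("order.created", received.append)
bus.publish("order.created", {"order_id": 42, "total": 19.99})
```

In the actual deployment each subscriber is an independent service consuming from a broker, so publishers never need to know who is listening.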

Connect 4 Game
Featured Project

The Connect4Game project is a digital version of the classic Connect 4, built with WPF for the client and ASP.NET Core for the server. The server handles game logic, player interactions, and AI strategies, while the client offers a dynamic game board and communicates with the server via a Web API. With a focus on best coding practices, the project employs the MVC and MVVM patterns, ensuring an engaging and seamless gaming experience for Connect 4 fans.
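The server's win-detection logic can be sketched roughly as follows (the actual project is in C#; the board representation and function name here are assumptions for illustration):

```python
def has_four_in_a_row(board, player):
    """Return True if `player` has four connected discs.

    `board` is a list of rows; empty cells are None.
    """
    rows, cols = len(board), len(board[0])
    # Check right, down, and both diagonals from every cell.
    directions = [(0, 1), (1, 0), (1, 1), (1, -1)]
    for r in range(rows):
        for c in range(cols):
            for dr, dc in directions:
                if all(
                    0 <= r + i * dr < rows
                    and 0 <= c + i * dc < cols
                    and board[r + i * dr][c + i * dc] == player
                    for i in range(4)
                ):
                    return True
    return False
```

Scanning every cell in four directions is O(rows x cols) per check, which is plenty fast for a 6x7 board.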

Crafted a dynamic song management platform using React and Tailwind CSS, powered by a NestJS backend with MySQL integration. Containerized with Docker for streamlined deployment, with seamless song search and data handling.

Crafted this very site you're exploring! Built responsively with Next.js and Tailwind CSS, it's spruced up with mouse-reactive animations, showcasing my projects and skills in a lively and interactive style. Dive in and enjoy!

Developed a real-time e-commerce data processing system, harnessing Kafka and PySpark to integrate and analyze sales data. This solution enabled prompt insights into purchasing patterns and ensured top-tier data quality, critical for subsequent analytical modeling.
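The core aggregation can be illustrated in plain Python (the real pipeline does this with PySpark over a Kafka stream; the sale records and field names below are made up):

```python
from collections import Counter

def top_products(sales, n=2):
    """Count purchases per product and return the n most bought.

    `sales` is an iterable of dicts as they might arrive from a Kafka topic.
    """
    counts = Counter(sale["product_id"] for sale in sales)
    return counts.most_common(n)

sales = [
    {"product_id": "A", "amount": 10.0},
    {"product_id": "B", "amount": 5.0},
    {"product_id": "A", "amount": 10.0},
]
```

In PySpark the same shape becomes a `groupBy` and count over a streaming DataFrame, distributed across executors instead of a single process.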

An application that plans a traveling itinerary for the user based on their favorite places on Google Maps! Each route is computed from the locations of the user's favorite places and an estimate of how long each visit will take. Built with PostgreSQL for the relational dataset, a RESTful CRUD API, JUnit tests, and much more!
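One simple way to order the stops is a greedy nearest-neighbor pass, sketched below (the real app draws on Google Maps places and travel-time estimates; the straight-line distance, place names, and coordinates here are illustrative assumptions):

```python
from math import dist

def plan_itinerary(start, places):
    """Greedy nearest-neighbor ordering of favorite places.

    `places` maps name -> (x, y) coordinate. A real planner would use
    travel times from a maps API instead of straight-line distance.
    """
    remaining = dict(places)
    route, current = [], start
    while remaining:
        # Always visit the closest unvisited place next.
        nearest = min(remaining, key=lambda name: dist(current, remaining[name]))
        route.append(nearest)
        current = remaining.pop(nearest)
    return route
```

Greedy ordering is not optimal in general, but it is fast and usually good enough for a handful of favorite places.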

TV Show Runtime
Featured Project

Linux application that takes a list of TV show names, fetches a JSON listing of each show's episodes via an HTTP GET request, and calculates the show's total runtime. Requests run in parallel to compute each show's runtime faster, and the application reports the shows with the longest and shortest runtimes.
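The parallel fan-out can be sketched with Python's ThreadPoolExecutor (the original is a Linux application calling a real episodes API; the fetcher below is stubbed with made-up shows and runtimes):

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_episodes(show):
    """Stand-in for the HTTP GET; a real version would request the
    episodes endpoint and parse runtimes out of the JSON response."""
    fake_db = {
        "Show A": [42, 42, 45],  # episode runtimes in minutes
        "Show B": [25, 30],
    }
    return fake_db[show]

def total_runtimes(shows):
    """Fetch each show's episode list in parallel and sum the runtimes."""
    with ThreadPoolExecutor() as pool:
        runtimes = pool.map(fetch_episodes, shows)
    return {show: sum(eps) for show, eps in zip(shows, runtimes)}

totals = total_runtimes(["Show A", "Show B"])
longest = max(totals, key=totals.get)
shortest = min(totals, key=totals.get)
```

Because the work is I/O-bound (waiting on HTTP responses), threads give near-linear speedup over fetching the shows one at a time.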

The 'Titanic - Machine Learning from Disaster' project on Kaggle focused on predicting the survival of Titanic passengers. Using Python, key libraries such as Pandas and scikit-learn were employed for data analysis and modeling. The data underwent rigorous preprocessing, including handling missing values, encoding categorical variables, and feature engineering techniques such as extracting the 'Title' from passenger names. A heatmap was used to visualize correlations between features, providing insights for model optimization. The primary predictive model was logistic regression implemented with the SGDClassifier, fine-tuned to enhance its accuracy.
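The 'Title' feature extraction step can be sketched like this (the regex and example names are assumptions based on the standard Titanic dataset's "Surname, Title. Given names" format):

```python
import re

def extract_title(name):
    """Pull the honorific ('Title') out of a Titanic-style passenger name,
    e.g. 'Braund, Mr. Owen Harris' -> 'Mr'."""
    match = re.search(r",\s*([A-Za-z]+)\.", name)
    return match.group(1) if match else None
```

Applied across the Name column (e.g. with `df["Name"].apply(extract_title)` in Pandas), this yields a low-cardinality categorical feature that correlates with age, sex, and social class.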

More projects will be uploaded soon!