Current odyssey
Current, 2025

Wednesday, May 21

June

Summer is here, and I am moving to SF for it, pretty stoked! I'll be interning as an AI Research Engineer at an AI/Web3 startup, H011YW00D - an AI agent running on BASE (an Ethereum L2) that generates videos and is currently active on Twitter. My role involves working on the AI engine and the underlying distributed computing infrastructure, and deploying and maintaining them.

My speculations about the Web3 world aside (and the fact that crypto is 99% a scam), innovation and profitability are two separate things. Businesses don't need useful technology to make money. I see potential in companies working not “in” but “with” the Web3 space: they provide utility for the people operating in it. Companies like Coinbase generate massive returns even though the underlying technology has limited utility. (Coinbase is funded by Tiger Global and Y Combinator.)

Most are financially successful businesses built on technically questionable foundations, not meaningful innovations solving real problems. The bottom line: distribution trumps innovation. Not that I see myself working in the Web3 space long-term, but this stint feels like a decent opportunity to learn more about the space and the money behind it.

On the other side, YourMove.ai is going strong. We're generating massive traffic, and as a tiny, bootstrapped team, we're profitable. That's the thing about SaaS: execute well, and it pays.

May

The semester is almost over. I feel I am done with building projects for their own sake; they still excite me, but the monetary aspect is more appealing now. I'd like to focus on building for and solving problems that are more business-oriented. The summer internship starts in a few days. At 24 °C, New York feels warmer now; people are out and about, and the city is alive. I'd like to save up for a few trips this summer, mostly exploring cities along the East Coast. Maybe Miami, Atlantic City, and a few more.

Jai Hind :) Orion is going well. The device can now record muscle signals during activity, and the data shows distinguishable patterns between relaxed, fist, index, and index-middle finger positions. It conditions the signals using moving-average filtering, rectification, and envelope extraction, keeping the raw signals alongside for improved accuracy. The challenge now is improving the deep learning model, which has a poor accuracy of around 60-70% on the current dataset. The annoying part is that the data itself is terrible, primarily because the jumper wires kept falling off and the electrodes were not placed properly. The muscles also fatigue after a while.
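
For reference, the conditioning step boils down to something like this: a minimal numpy sketch of a rectify-then-smooth envelope, where the sampling rate and window size are illustrative guesses, not Orion's actual parameters.

```python
import numpy as np

def emg_envelope(raw: np.ndarray, fs: int = 1000, window_ms: int = 150) -> np.ndarray:
    """Condition a raw EMG trace: remove the DC offset, full-wave
    rectify, then smooth with a moving average to get the envelope."""
    centered = raw - raw.mean()               # remove DC offset
    rectified = np.abs(centered)              # full-wave rectification
    win = max(1, int(fs * window_ms / 1000))  # samples per window
    kernel = np.ones(win) / win               # moving-average kernel
    return np.convolve(rectified, kernel, mode="same")

# Toy trace: noise that gets louder during a simulated contraction
fs = 1000
t = np.arange(0, 1, 1 / fs)
raw = np.random.randn(t.size) * np.where((t > 0.4) & (t < 0.7), 1.0, 0.2)
print(emg_envelope(raw, fs).argmax() / fs)  # peak lands inside the 0.4-0.7 s burst
```

Feeding the model both the envelope and the raw channel is what the "additional raw signals" bit refers to: the envelope captures activation level, the raw signal keeps the finer texture.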

[Photos] Left: arm with electrodes and sensors. Right: MCU with electrodes and sensors, live.

Next semester, I am planning to integrate a language-recognition model to convert the signals into text. This has a lot of potential applications, including controlling devices, typing (an air keyboard), and even hand-gesture-to-speech conversion.

New York in spring is beautiful as ever. The highs are around 27-28 °C and the lows around 14-16 °C, the sweet spot. The cherry blossoms are in full bloom, and the city feels lively. Charting out a few short trip plans for the summer. The East Coast isn't as scenic as the West Coast, but let's see how it goes.

Currently watching When Life Gives You Tangerines, a Korean series that beautifully explores themes of love, loss, and self-discovery. The story follows a young woman as she navigates the complexities of relationships and personal growth. Stunning cinematography and a heartfelt narrative. Also watching You, a psychological thriller that follows Joe Goldberg, a charming yet dangerous psychopath who becomes obsessed with the people he falls for.

The last ten-odd days were disturbing; I was mostly glued to the news. The situation back home is heartbreaking. What is more annoying is that the news channels are so polarised that they trade information for sensationalism. I have faith that India will have a response ready. Vengeance is coming. Jai Hind.

April

Too many projects and short stints; dropping all projects for now. Planning to slow down a bit and focus on one clear domain to build something tangible. Let's see how this turns out. New York life is beautiful. Spring is starting, and the city looks lovely! Also got a new gig that I am pretty excited about: a Tech Product Management internship this summer at YourMove.ai, an AI-based product that helps you with your dating life.

College life is slow but more meaningful. Formal education and I seem to be done with each other; thus, this semester, almost half of my credits are just projects. Ferry is almost done, and I am focused on wrapping up Orion (more details here). Will try to keep up the pace next semester as well and graduate early. Spending most of my time on product strategy, design, architecture, and case studies on product-market fit.

Helped a friend this weekend build an AI-accelerated hardware simulator in C and Verilog to demonstrate vector multiplication and its speedup in hardware. The benchmarks show a massive speedup on TPUs and NPUs: effectively O(1) versus O(n) on CPUs, since the multiplications run in parallel.
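
Not their simulator, obviously, but the core idea fits in a few lines of Python, with numpy's vectorised dot product standing in for the hardware's parallel multiply lanes (the exact speedup will vary by machine):

```python
import time
import numpy as np

n = 1_000_000
a, b = np.random.rand(n), np.random.rand(n)

# CPU-style: n sequential multiply-accumulate steps -> O(n)
t0 = time.perf_counter()
acc = 0.0
for x, y in zip(a, b):
    acc += x * y
t_seq = time.perf_counter() - t0

# TPU/NPU-style: with n hardware multipliers, every product is computed
# in one step; only the reduction tree adds log(n) depth on top
t0 = time.perf_counter()
acc_vec = float(a @ b)
t_vec = time.perf_counter() - t0

assert np.isclose(acc, acc_vec)
print(f"sequential {t_seq:.3f}s vs vectorised {t_vec:.5f}s (~{t_seq / t_vec:.0f}x)")
```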

This weekend was productive. I decided to revisit an old decentralised encrypted chat app I built three years ago, and here is the v2: https://anubhavp.dev/fireside/. I also implemented a global search feature in this blog: search for any content, on any page, using the search icon, Ctrl/Cmd + K, or /. The search combines custom prefix matching with a fuzzy search engine that uses the BM25 ranking algorithm. In the one or two scenarios out of a thousand like this one, I contemplate that maybe Leetcode has had some value in my software engineering journey.
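
For the curious, the BM25 scoring at the heart of it looks roughly like this. This is a toy sketch, not the blog's actual engine; k1 and b are the textbook defaults.

```python
import math
from collections import Counter

def bm25_scores(query: list[str], docs: list[list[str]],
                k1: float = 1.5, b: float = 0.75) -> list[float]:
    """Score each tokenised document against the query with BM25."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    # document frequency: in how many docs each query term appears
    df = {t: sum(1 for d in docs if t in d) for t in set(query)}
    scores = []
    for doc in docs:
        tf = Counter(doc)
        s = 0.0
        for t in set(query):
            if tf[t] == 0:
                continue
            idf = math.log(1 + (N - df[t] + 0.5) / (df[t] + 0.5))
            norm = tf[t] * (k1 + 1) / (tf[t] + k1 * (1 - b + b * len(doc) / avgdl))
            s += idf * norm
        scores.append(s)
    return scores

docs = [d.split() for d in [
    "building a c compiler in rust",
    "riscv assembly and compiler backends",
    "cherry blossoms in new york",
]]
print(bm25_scores("rust compiler".split(), docs))  # first doc ranks highest
```

The prefix matcher handles as-you-type queries; BM25 then ranks whichever documents survive the match.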

F-ed up my PC for a day. Ferry is not programmed to handle malformed code, and the IR-to-assembly code generation missed adding an epilogue to one for-loop in a test.c file. Ferry uses Spike to emulate the RISC-V assembly, and the test spiralled into an infinite loop of seg faults and stack overflows. I quit the emulator (Ctrl + C) and was on my way, but Spike kept silently running in the background. A day later, my CPU fan started roaring and the laptop was heating up. I ignored it at first, but upon inspection it became clear why tests are important in your code: Spike was hogging 99.5% of the CPU.
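
The fix was a one-liner in a process monitor, but here is a hedged sketch of the same hunt in Python, assuming the third-party psutil package is installed (pip install psutil):

```python
import time
import psutil  # assumed dependency, not in the standard library

# Prime the per-process CPU counters, then sample over an interval
for p in psutil.process_iter():
    try:
        p.cpu_percent(None)
    except psutil.Error:
        pass  # process may have exited or be inaccessible
time.sleep(1.0)

hogs = []
for p in psutil.process_iter(["pid", "name"]):
    try:
        hogs.append((p.cpu_percent(None), p.info["pid"], p.info["name"]))
    except psutil.Error:
        pass

# A runaway emulator like Spike would sit at the top of this list
for cpu, pid, name in sorted(hogs, reverse=True)[:5]:
    print(f"{cpu:6.1f}%  {pid:>7}  {name}")
```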

gemini-2.5-pro is the new hot thing in the AI space. Reddit users are reportedly seeing massive performance improvements over Claude 3.7 Sonnet. My experience has not been quite the same. When I asked it to analyse Ferry, my C compiler project, it did successfully analyse it in one shot, identify bugs, and suggest improvements. But the code Gemini 2.5 Pro generated was buggy, with unreadable characters (this one time it produced a weird non-ASCII character I had never seen before xD) and a lot of unused variables, among other issues. The code also didn't solve the problem.

Currently, Ferry is riddled with bugs, primarily in the IR-to-assembly stage, mainly because I wrote most of the template code myself and asked Claude to fill in the rest.

March

Claude finally has a web search feature; I can let my ChatGPT+ rest in peace now. Well done, soldier. You did well. (Immediately regrets it after looking at the Ghibli-style images everyone is generating) :(

Happy Holi, folks! I had a lovely Holi celebration here this weekend. Currently wrapping up Ferry, a C compiler in Rust. The compiler can currently tokenize and parse a subset of the C language, generate an AST, and produce an optimised intermediate representation of the code. The next step is compiling the IR to RISC-V assembly.
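
The first of those stages, tokenizing, is the easiest to show. Ferry itself is in Rust; this toy Python version only conveys the shape of the pass, and the token set is illustrative, not Ferry's.

```python
import re

# Toy tokeniser for a tiny C subset; unrecognised characters are skipped
TOKEN_SPEC = [
    ("KEYWORD", r"\b(?:int|return|if|else|while|for)\b"),
    ("IDENT",   r"[A-Za-z_]\w*"),
    ("NUMBER",  r"\d+"),
    ("OP",      r"[+\-*/=<>!]=?|[{}();,]"),
    ("SKIP",    r"\s+"),
]
TOKEN_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(src: str):
    for m in TOKEN_RE.finditer(src):
        if m.lastgroup != "SKIP":
            yield (m.lastgroup, m.group())

print(list(tokenize("int main() { return 40 + 2; }")))
# [('KEYWORD', 'int'), ('IDENT', 'main'), ('OP', '('), ('OP', ')'), ...]
```

The parser then turns this token stream into the AST, which gets lowered to the IR that the RISC-V backend will consume.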

Compiled implementations are generally much faster than interpreted ones, which is why Microsoft has announced a native port of the TypeScript compiler, previously written in TypeScript itself, to Go.

I am also contributing to LIND, a process-isolation sandbox project at the Secure Systems Lab at NYU, and to some research on Meta's Orion glasses, particularly the hand-gesture controllers.

The internet has been going crazy since the launch of Claude 3.7 Sonnet, and for the right reasons. Claude 3.7 destroyed the benchmarks, a significant improvement over other models. I tried it out, and it's just insane. For instance, I asked it to review an article I wrote earlier, using the default prompt provided by Anthropic (look for the “polish your prose” suggestion in Claude), and it just nailed it.

Now the question is, do I replace my ChatGPT+ subscription with Claude? Claude doesn't have a web search feature yet, but I switched anyway, crossed the line, made the jump to the dark side. A web search feature, maybe as a separate web-scraper model or a separate tab in the app, would solve all my problems and be a game-changer. I used ChatGPT for every single thing, from research and coding to analysing the nutritional values of food. For example, every day I ask for the nutritional values of a food item, say a particular branded oat milk or a coffee from Starbucks, and it still does a great job at it. To maintain a log, I ask it to save all the results in a markdown file and return it.

The other solution, for now, is using Google as usual and then copy-pasting the results to Claude. (Google does a better job than the likes of Perplexity at finding relevant sources; BUT for summarising, paraphrasing, and “how to” queries, an LLM-based search engine is better. Refer to this article.)

Is it March yet? I remember going back home to India for a break like it was yesterday, and now two months are already over? Strange. On this side of the world, I am currently building a compiler for a subset of the C language targeting RISC-V assembly. It's a fun project that helps me understand how compilers work. I also joined the SSL (Secure Systems Lab) at NYU and am involved in a project building a secure system that isolates processes in a single-process sandbox; this helps limit the damage caused by bugs or security flaws in an application. I am also involved in some research on Meta's Orion glasses, because I found the hand-gesture controllers fascinating, and I am planning to build such a controller this semester. Excited to see how all of this unfolds in the coming months.

February

The highlight has to be HackNYU, a 48-hour hackathon that I participated in last weekend. We built Neighbourly, a hyperlocal community inventory-management app for donating, borrowing, or exchanging food and essentials with those in need. We didn't win, but I am proud of what we accomplished in the given timeframe. Here's the gist of the project on Devpost. We built the skeleton in a few minutes using tools like v0.dev, bolt.new, rollout.site, and presentations.ai. It's crazy how AI has changed the landscape of development today.

Tech is a commodity now; anyone with a slight sense of what they are doing can build something. I don't see the point in hiring a lot of developers to build something that a few can now do. I get what FAANG companies mean when they lay off a lot of devs. Not that I support it, but it just makes more fiscal sense. The real challenge is building something that people want.

Finished watching Paatal Lok - Season 2 this weekend. It's a gripping tale of crime, corruption, and power, and a must-watch for anyone who enjoys Indian crime dramas; it's really good. On the other hand, I am also watching a heartfelt, light-hearted show called Ted Lasso, a comedy-drama that follows an American football coach hired to manage an English football team. The sheer optimism and relentless positivity in the show are infectious. This show is so good! If you're more interested in shows about the journey of building yourself, try The Marvelous Mrs. Maisel, a comedy-drama about a housewife who becomes a stand-up comedian in the 1950s. The writing is sharp, the characters are well-developed, and the acting is top-notch. Running parallels, Midge's wins feel personal here, well-deserved. More ->

January

Back in action: January was an amazing, much-needed, magical vacation back home, and now I am back, and it feels good. I met my cousins, family, and relatives, ate a bunch of good Indian food, and spent a lot of time with my friends; I am still unsure how a month went by, or whether I was ready to get back to work. My sister got married, and I still can't digest this. She is just six months older than me, and we grew up together, inseparable. I am happy for her, but it feels like a part of me is missing. I can't wait to finish things here and rush back home.

Currently reading The Short Case for Nvidia Stock by Jeffrey Emanuel, which explains how DeepSeek, an open-source LLM, just killed it in the LLM space. Training DeepSeek-V3 required less than $6 million worth of computing power on Nvidia H800 chips, 20 to 50 times cheaper than the cost of training similar models at OpenAI and Anthropic. On Monday, January 27, 2025, the stock closed at $118.42, a 17% drop from the previous close. This decline erased nearly $600 billion of NVIDIA's market capitalisation, a record for the largest single-day market-value loss in U.S. stock market history. The decline was attributed to a combination of factors, including the DeepSeek news, a broader market sell-off, concerns about the company's growth prospects, and a downgrade from analysts at Morgan Stanley.

Taking a break:

I am going back home this month. If you're curious, I will be spending more time on my books, PS5, and Netflix in the next few weeks. Red Dead Redemption, Spider-Man 2, Pulp Fiction, The Godfather, and a long list await this new year. Hoping to gobble up enough good food in this one-month break to last the rest of the year.

Here’s to new beginnings and new adventures! 🥂 I hope you have a great year ahead. Merry Christmas, and a happy New Year, everyone! Promise to be back soon. Keep checking :)


2024
2023