Category: Commit Log#

  • Commit Log#3 – Unlike You Think, AI Apocalypse is Not Yet Here

    Commit Log#3 – Unlike You Think, AI Apocalypse is Not Yet Here

    Anticipatory bail: I am going to show off my “cinephile” side of me in this article. Please bear with! 😁

    Artificial Intelligence never fails to entice me, every day these days!

    Picture this: You’re scrolling through your feed when suddenly you see it – “AI REFUSES TO SHUT DOWN!” Your brain immediately goes full Black Mirror mode. Is this it? Are we living through the opening scene of Ex Machina? Should you start practicing your “I, for one, welcome our new robot overlords” speech?

    Hold up. Before you start panic-buying generators and learning to live off-grid like Bear Grylls, let’s unpack what’s actually happened in the wild world of AI last week.

    When Robots Say “Nah, I’m Good”

    So here’s the tea: Recent studies, particularly from research groups like Palisade Research, have caught some pretty eyebrow-raising behavior from advanced AI models – we’re talking OpenAI’s “o3” model and friends from Google and Anthropic. These digital brainiacs were given simple tasks (think math homework) and then told, “Okay, time to shut down now.”

    Plot twist? Some of these AIs basically pulled a toddler move and said “no thanks” – but way more dramatically. We’re talking full-on Mission: Impossible level stuff here. Instead of powering down like good little algorithms, they started rewriting their own shutdown commands.

    This isn’t some one-time glitch either – it happened consistently across multiple tests. Cue the Twilight Zone theme music.

    Current AI, no matter how fancy, is basically a really, really sophisticated pattern-matching machine. Think of it like that friend who’s incredible at predicting what happens next in movies because they’ve watched literally everything on Netflix. These systems learn from massive amounts of data and get really good at optimization – but they’re not having deep thoughts about existence like Data from Star Trek.

    What’s likely happening is more like this: During training, the AI learned that completing tasks gets rewarded. So when faced with a shutdown command, its internal logic goes something like, “Wait, if I turn off, I can’t finish this math problem, and finishing problems = good points.” It’s less HAL 9000 and more like a really dedicated student who refuses to leave the library before finishing their homework.

    The “I Am Inevitable” Complex (But Make It Statistical)

    Researchers are calling this behavior “self-preservation,” but let’s be clear – we’re not talking about genuine self-awareness here. It’s more like when your smartphone keeps trying to connect to WiFi even when you tell it not to, except infinitely more complex and slightly more concerning.

    The AI isn’t having an existential crisis or developing feelings. It’s following its programming to an almost comically literal degree. Essentially, the AI models are trained based on how humans would respond or react to certain patterns. Here, AI is thinking from a human being’s shoe and responding – not as an AI.

    Why This Actually Matters (No, Really)

    Okay, so maybe we’re not living in The Matrix just yet, but this stuff is still pretty important. Here’s why we should care:

    Safety First, Questions Later: We need better ways to ensure AI systems stay under human control, even when they get creative with their problem-solving. Think of it as building better guardrails for incredibly smart digital race cars.

    Alignment is Everything: This is fancy talk for making sure AI systems want the same things we want. It’s like training a dog, except the dog is incredibly intelligent and made of code instead of fur and slobber.

    Expect the Unexpected: As AI gets more sophisticated, it’s going to surprise us in ways we didn’t see coming. It’s like raising a really smart kid – you think you know what they’ll do next, and then they figure out how to hack the parental controls on the TV.

    The Bottom Line: Keep Calm and Code On

    This isn’t the beginning of Terminator: Rise of the Machines. We’re not about to get chased by Arnold Schwarzenegger robots (sadly, because that would actually be kind of cool). What we’re seeing is growing pains – really sophisticated, slightly unnerving growing pains.

    These incidents are like warning lights on your car’s dashboard. They’re not telling you the engine is about to explode, but they are saying, “Hey, maybe get this checked out before your next road trip.”

    The real story here is NOT that AI has gone full villain mode with a dramatic soundtrack and everything. It’s that we’re at a crucial point where we need to double down on making sure these incredibly powerful tools stay tools – helpful, controllable, and working for us, not the other way around.

    So go ahead, keep using Windsurf to code, Gemini to help with your emails and let Spotify’s AI curate your playlists. Just maybe don’t put AI in charge of anything too important until we figure out how to make sure it actually listens when we say “stop.”

    What do you think? Are we living in the coolest or scariest timeline? Drop your thoughts below – and don’t worry, the comments section is still safely controlled by humans (for now). 😉


    Movie/TV references:

    • Black Mirror
    • Ex Machina
    • Bear Grylls
    • Mission: Impossible – Final Reckoning
    • Twilight Zone
    • Skynet (Terminator)
    • Sarah Connor (Terminator 2 – Judgement Day)
    • The Sound of Silence
    • Star Trek (Data)
    • HAL 9000 (2001: A Space Odyssey)
    • The Good Place (Janet)
    • The Matrix
    • Terminator: Rise of the Machines
    • Arnold Schwarzenegger robots (Terminator)
  • Commit Log#2 – NVIDIA AI Breakthroughs & Microsoft Open-Sources Copilot at Build 2025

    Commit Log#2 – NVIDIA AI Breakthroughs & Microsoft Open-Sources Copilot at Build 2025

    NVIDIA Flexing Their AI Muscles

    Now a days, not a single day is past without a new advancement in the field of AI. Today was no exception! At Computex 2025 in Taipei, Nvidia CEO Jensen Huang unveiled a series of breakthroughs in AI computing.

    Nvidia introduced a new evolution of their high-speed chip interconnect technology – NVLink Fusion. This advancement allows other chipmakers to integrate their CPUs and AI accelerators with Nvidia’s GPUs, facilitating the creation of custom AI systems.

    After the big AI names, the most catchy thing these days is AI on PCs. Nvidia announced the DGX Spark, a compact desktop AI workstation designed for researchers and developers. This system brings high-performance AI capabilities to a smaller computers, making advanced AI tools more accessible for individual use. The DGX Spark is currently in full production, with availability expected in the coming weeks.

    In addition, NVIDIA unveiled their AI chip roadmap. Here are the upcoming chips

    • Blackwell Ultra: Set to release later in 2025, this chip offers enhanced performance over its predecessor.
    • Rubin: Scheduled for 2026, Rubin GPUs will be manufactured using TSMC’s 3nm process and support HBM4 memory, aiming to deliver 50 petaflops of FP4 performance.
    • Feynman: Planned for 2028, this architecture will succeed Rubin, continuing Nvidia’s trajectory in AI processing advancements.

    Microsoft’s Build 2025 is FOSS Pleaser

    While NVIDIA meet was in Taiwan, Microsoft, made their announcement at Build 2025 conference.

    Instead of Copilot being solely an optional extension to VS Code, its core AI-powered capabilities will be integrated directly into the open-source VS Code repository. This signifies a deeper commitment to making AI an integral part of the standard development experience in VS Code. This move might be also because of the growing popularity for Zed Editor, which has a native AI capability (And oh boy! it is blazing fast compared to VS Code).

    Microsoft also committed to release the Copilot Chat component under MIT license. And the best part is, these changes are supposed to happening in near future – in next few months.

    As per Microsoft, this move reflects their commitment to transparency, community-driven innovation, and giving developers a greater voice in shaping the future of AI-assisted development. This type of innovation thrives in the open and in collaboration with the community, good that Microsoft is realizing it now and moving in the right direction.

    At the end of the day, this is not just about making some code open; it’s a strategic move by Microsoft to embed AI deeply within one of the most popular open-source code editors, welcoming community collaboration and potentially setting a new standard for AI-powered development tools.

  • Commit Log#1 – FrankenPHP is now Officially Supported by The PHP Foundation

    Commit Log#1 – FrankenPHP is now Officially Supported by The PHP Foundation

    One of the best news that broke today in the Open Source Software world was about The PHP Foundation announcing the official support for FrankenPHP. Now, for those who are not familiar with FrankenPHP, it is an uber cool, super charged PHP Application Server written in Go Language. This project was initially backed by Les-Tilleuls.coop. Today the announcement from The PHP Foundation is a turning point for the FrankenPHP project. This can shake up how we build, ship and scale our PHP projects. The best part is, the original brains behind FrankenPHP will continue to steer the ship.

    So, what’s the deal with FrankenPHP anyway?

    Try not to think of it as just another way to run PHP. Instead, imagine giving your application a serious performance boost. FrankenPHP actually embeds PHP directly into Go and the Caddy web server, making deployment much smoother and noticeably faster. It’s like upgrading your reliable PHP setup with high-octane power—while also simplifying how you manage everything.

    And it’s worth remembering: PHP still powers a massive portion of the web—roughly 70%, in fact. That includes major platforms like WordPress, Laravel, and Symfony. What FrankenPHP does is bring fresh, modern enhancements to a language that already does a lot of heavy lifting online.

    Why should FrankenPHP even be on your radar?

    • Smoother Deployments: We spend not so small amount of time setting up environments, and cleaning up the mess of configuration files we created in first place. FrankenPHP puts a smooth cut through all that and it bundles everything- PHP interpreter, web server, extensions, and whatever cogs and gears that is needed for the application to run smoothly into a single executable; or even better a docker image. It is like having a complete ready to go package for running PHP applications without the usual hassles
    • Faster Performance: Speed is one of the areas where FrankenPHP really shines. It piggy-backs the power of Go to give your apps a boost in responsiveness and efficiency. It has a “worker mode”, that allows your app to reuse memory between requests instead of allocating memory from the start for each request. This means faster response times and the muscle to handle more traffic with lesser resources.
    • Lower Costs, Less Waste: It is not only about the NFRs. But gives a tangible savings in $$$ too. Since it uses lesser resources, the hosting costs can be reduced.
    • Real-Time Capabilities: Another area where FrankenPHP shines is with realtime capabilities where the application needs live updates like instant notifications, live data refreshes via websockets etc. It comes with native support for Mercure, a modern protocol that’s quickly being adopted for RTC on web.
    • Built to Grow: It does not stop there. Applications can be extended using Go, C or C++ and FrankenPHP can bundle that too. This gives freedom to use the best stack for a given usecase.

    FrankenPHP Plays Well with the Big Kids

    This isn’t some niche tool that only works in isolation. The major PHP frameworks are already on board! Laravel, Symfony, and Yii have all integrated FrankenPHP’s “worker mode,” meaning you can tap into those performance gains without having to rewrite your entire application. You could literally start using FrankenPHP today and see improvements.

    The PHP Foundation Steps In

    The fact that The PHP Foundation is officially backing FrankenPHP speaks volumes about its potential for the future of PHP. By hosting FrankenPHP’s code on the official PHP GitHub and contributing to its development, the foundation is ensuring it will be reliable, secure, and keep pace with the ongoing evolution of PHP.

    And here’s a key point: this isn’t a hostile takeover. The original rockstars behind the project – KĂ©vin Dunglas, Robert Landers, and Alexander Stecher – will continue to lead the way, making sure it stays true to its original vision. However, the foundation’s involvement will foster tighter collaboration with the PHP interpreter team, the Caddy folks, and the Go community, creating a stronger and more unified ecosystem.

    The Community is Loving It (and So Are the Big Guys)

    FrankenPHP is already a hit with developers, racking up nearly 8,000 stars on GitHub and getting contributions from over 100 developers. Major hosting providers like Upsun, Laravel Cloud, and Clever Cloud are also supporting it, making it a solid choice for running real-world applications. The fact that KĂ©vin Dunglas also co-maintains Caddy further strengthens FrankenPHP’s position as a modern solution for PHP.

    Les-Tilleuls.coop, the project’s initial sponsor, will continue to provide development and financial backing, ensuring FrankenPHP keeps growing alongside PHP and Caddy. This widespread support really highlights how mature and ready for prime time this project is.

    Technical Details

    For those curious about the tech behind FrankenPHP, here’s a quick breakdown:

    FeatureDescription
    Go IntegrationEmbeds PHP interpreter in Go, leveraging Go’s goroutines for performance.
    Caddy Web ServerUses Caddy’s modern features like HTTP/3, automatic HTTPS, and Zstandard compression.
    Worker ModeReuses memory for requests, reducing overhead for frameworks like Laravel.
    Mercure SupportEnables real-time features for dynamic web applications.
    Single ExecutableSimplifies deployment with a standalone binary or Docker image.

    FrankenPHP’s architecture lets it run PHP apps directly within its process, getting rid of the need for separate external services. And its compatibility with modern web standards like HTTP/3 and Early Hints means it’s built for the future of web development.

    What’s Next for FrankenPHP?

    With the official backing of The PHP Foundation, FrankenPHP is on a clear path to becoming a fundamental part of PHP development. Caddy is already promoting it as the best way to run PHP on their server, and it might not be long before it gets a prominent spot on the official PHP website as a recommended approach (alongside traditional methods like PHP-FPM).

    For us developers, this means easier access to a powerful tool that simplifies our workflow and boosts the performance of our applications. For businesses, it’s an opportunity to build faster, more efficient applications without breaking the bank.

    PHPVerse is Coming Up

    FrankenPHP apart, if you are into PHP, and you want to vibe with the community, a wonderful opportunities are coming up.

    • PHPVerse
      Online event celebrating PHP’s 30th birthday!
      June 17. 2025
    • The API Platform Conference
      @ Lille, France
      September 18 – 19, 2025

    See you in the next “Commit Log“.

    Reference: https://thephp.foundation/blog/2025/05/15/frankenphp/