More than ten years have gone by since the OP asked this, and Google search trends (within the genre of web programming) still show it as an active debate. I want to give a 2021 answer, and since I've been programming professionally since 1979 and have used every aforementioned language extensively, I feel quite qualified to do so.
More internet devices run C than any other language. They just aren't usually the public-facing devices per se (Cisco, F5 Networks, etc. -- their cores are all compiled C). These are embedded platforms with a purposeful function that require almost RTOS-like stability. C will continue to be the dominant language here: the language and its toolchains are so mature that instability simply isn't a factor.
Big Tech uses C all the time; you aren't going to find a cron job in their stack that does a log flush or backup written in bash or PHP. If it's complex, it's likely a gcc-compiled binary doing the heavy lifting. Nothing is as reliable as C: it's been around forever, and we're not finding new bugs in gcc.
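To make that concrete, here's a sketch of the kind of small, gcc-compiled maintenance binary I mean (my own illustration, not anything from a real Big Tech stack): copy a log to a timestamped backup, then truncate it in place.

```c
/* logflush.c -- a hypothetical cron-friendly maintenance binary.
 * Copies a log file to a timestamped backup, then truncates it.
 * Build: gcc -O2 -o logflush logflush.c
 */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(int argc, char *argv[])
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <logfile>\n", argv[0]);
        return EXIT_FAILURE;
    }

    /* Build a timestamped backup name, e.g. app.log.20210101120000 */
    char stamp[32], backup[4096];
    time_t now = time(NULL);
    strftime(stamp, sizeof stamp, "%Y%m%d%H%M%S", localtime(&now));
    snprintf(backup, sizeof backup, "%s.%s", argv[1], stamp);

    FILE *src = fopen(argv[1], "rb");
    if (!src) { perror("fopen log"); return EXIT_FAILURE; }
    FILE *dst = fopen(backup, "wb");
    if (!dst) { perror("fopen backup"); fclose(src); return EXIT_FAILURE; }

    /* Copy in fixed-size chunks -- no surprises, no dependencies. */
    char buf[8192];
    size_t n;
    while ((n = fread(buf, 1, sizeof buf, src)) > 0)
        fwrite(buf, 1, n, dst);

    fclose(dst);
    fclose(src);

    /* Truncate the original log in place. */
    FILE *trunc = fopen(argv[1], "wb");
    if (!trunc) { perror("truncate"); return EXIT_FAILURE; }
    fclose(trunc);

    return EXIT_SUCCESS;
}
```

A real deployment would lock the file (or use logrotate's copytruncate approach) to avoid losing writes between the copy and the truncate; the point is that it's a tiny, dependency-free binary a cron entry can call.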
The tech companies of the 1990s really built up the core of the technologies we use today, and C wasn't sexy [at all] to use. Many apps were developed with Java servlets and applets, which lend themselves naturally to an MVC model and are cross-platform. The latter point is very important, as I'll explain.
Microsoft didn't quite embrace C or C++; their Visual C++ was a terrible IDE compared to Borland's. Visual Basic was a hit for them, though, which is why VBScript became the language of Classic ASP, which many websites still use today. It was the first language IIS officially supported. Classic ASP is considered an old language with no support from MSFT anymore, but its usage was widespread in early web shops that wanted to build on Windows NT (or Server). Bill Gates really did write the best Basic interpreter back in the day, and the simplicity of Basic allowed for a very quick learning cycle to become a developer. Borland's IDE was amazing, but the company blew up (along with Novell and other greats from the 90s).
Nobody in their right mind in the 1990s was developing web apps for Linux quite yet. Other Unix derivatives, yes, but commercial versions of them; many people deployed completely different OSes altogether (think AIX or DEC VMS here). The words "free" and "great software" weren't often part of the same sentence. IT managers, who became the de facto original webmasters, didn't want their paycheck riding on something free (Linux), in part because the distributions were fragmented (RedHat and Mandrake perhaps excepted).
IT managers weren't programmers, so scripting languages became more popular for web development. Websites written entirely in bash were not unheard of.
Java (whose syntax derives from C) became perhaps the most popular notion of the future of the web, partially because Sun had great hardware and was renowned for its Solaris OS. Most C programmers interested in the web moved on to Java: it was pretty similar, and it offered built-in concurrency (C wasn't designed as a concurrent language) and mostly stable garbage collection and memory cleanup.
VB stagnated at Microsoft, and, seeking a compiled language for IIS, they evolved it into what's now the .NET Framework, which is essentially VB or C# wrapped in complexity. IIS's weaknesses were overcome by the sheer number of Windows NT/2000 deployments.
The bubble burst, and once Oracle absorbed Sun, Java, Solaris, and Oracle's databases weren't favorites amongst IT managers (though they were favorites with executives, because Oracle's sales team was top-notch). For one, Oracle's products were extremely expensive, and Oracle wasn't (and really still isn't) known as a cutting-edge technology company; they are the IBM of late, deploying what works and addressing the business problem more than the technical one.
Post-Y2K, Linux out-deployed just about everything on the web because the distributions had reached commercial quality. The whole idea of open source was working. Windows was suffering from repeated virus issues, and Apple had been written off as dead. Commercial platforms came under increasing fire as CFOs tightened budgets.
Linux distributions shipped a PHP interpreter from the get-go, and the language has really grown from there. Most who deployed Apache would use PHP simply because it was stable. Again, at this point we still had IT managers running these efforts.
In the mid-2000s, when marketing departments started taking over their companies' web presence, the needs quickly changed. Marketing folks love testing (think multivariate and A/B campaigns), so scripting languages started sneaking up as the best way to push quick changes to the web without having to hardcode HTML.
We forget: ECMAScript (JavaScript) really took off, Flash started to die, and the whole idea of controlling the DOM came about as faster and faster browsers became able to drive the UI.
Of all the modern languages out there, it's really the frameworks that have propelled the newer ones to success. Bootstrap was quickly absorbed when mobile came about, and integrating it with a legacy PHP app was easier than it would have been with a hardcoded C app. Scripting languages mostly took over from compiled ones as the choice du jour, though .NET still has its place for certain things, even if that's diminished in the last 2-3 years.
In the end, C is still the most widely used language "behind" the internet. Routers, load balancers, intrusion detection systems: it'll all be in C (or C++, but more likely C). These appliances are moving toward software deployment, but the hardware versions are still used by the larger datacenters. For example, AWS has a wrapper around a grouping of Cisco switches to change routes based on ToD (time of day) predictions. Amazon, Google, and Facebook all realized that a "follow the sun" model of load balancing was a good starting point, and they could really only trust low-level hardware devices to do that correctly.

Scripting languages, and even compiled ones, keep surfacing that are, for all intents and purposes, based on C anyway. But with the advent of k8s and Docker, it makes more sense to stand up a Node.js framework and work on a small unit of code that can be pushed independently of what the other pieces are doing. Outsourcing was used a lot in the earlier days, and this is about the only solid way of programming a system without conflict in a larger development environment.
We are still at the beginning of the internet; in another 10 years there will be yet another cycle, and it'll likely move away from typing. Drag-and-drop DAG nodes will be used to create web apps in a similar way that a VFX artist uses Nuke or Maya to build a motion graphics clip. Twilio does this, and even training an ML model is already being done this way with Peltarion.
We're not yet at the highest-level languages possible, as our abstractions in Ruby or PHP are still C-like, and throwing dollars at STEM education won't bring in legions of kids to study computer science. The barrier to entry will keep dropping, but a diverse understanding of languages will still be required for complex or specialized applications that need "something" like directly addressing a CUDA core (for example).
We are simply seeing a diversity of languages, as predicted by Ritchie himself, come about. Frameworks have matured, so it makes no sense to drop a new developer on a C+CGI project (see the sketch below for what that looked like) when bringing up an Angular scaffold takes 5 minutes.
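For contrast, a minimal C+CGI endpoint looked roughly like this (a sketch from memory, assuming a CGI-capable server such as Apache with mod_cgi; the file name and page content are made up):

```c
/* hello.c -- the pre-framework way to serve a page.
 * Build and drop into cgi-bin: gcc -O2 -o hello.cgi hello.c
 * Per the CGI spec, the server passes the query string to the
 * program via the QUERY_STRING environment variable.
 */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const char *query = getenv("QUERY_STRING");

    /* CGI output: headers, then a blank line, then the body. */
    printf("Content-Type: text/html\r\n\r\n");
    printf("<html><body>\n");
    printf("<h1>Hello from C</h1>\n");
    /* Unescaped here for brevity -- real code must HTML-escape this. */
    printf("<p>Query string: %s</p>\n", query ? query : "(none)");
    printf("</body></html>\n");
    return EXIT_SUCCESS;
}
```

Every header, route, and escaping concern is hand-rolled, and every change means a recompile and redeploy, which is exactly why generated scaffolds won for ordinary web apps.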
This was all expected, actually, by most of us from the 1980s. As programmers become more diverse, so do the ways to attack the problem in front of them. As open source becomes ever more capable (and incredible!), there's simply no financial advantage to starting from scratch. Headless CMSes are in fact already being deployed so the dev team doesn't have to write a line of backend code; they just worry about the user experience... and even then, the tools are getting more mature.
My point is this: C is definitely used on the web, and embedded systems use it all the time. MVC applications have progressed to where a framework can build a simple app for you within an hour, without compilation, and those frameworks happen to be more extensible in the newer languages. It's an apples-to-oranges comparison for me: I still use C for data normalization (for example), but if I'm being paid to write a web app, I can begin a Vue.js app while drinking a beer, so why work harder?