At any rate, the top scientists in any domain are published internationally. Research and development are highly affected by government investment; i.e., it's government grants that fund our top universities. Once the R&D is done, engineers take over.
Engineers are employed by private companies for the most part. They have to innovate in the building of things and then sales and marketing take over. The design part is critical and requires lots of creativity and a strategic view of the market. It helps to have a large customer base (or lots of seed money in order to grow the market).
IT in the US and elsewhere is about 30% engineering, 25% implementation, and the rest sales and marketing. But open source R&D makes it different from, say, automobile manufacturing. On the other hand, the automobile aftermarket is a lot like open source. In any case, the top engineering and design in IT and computer science is native to the US. It's a combination of things that makes the US the innovator, and a great deal of that has to do with our markets.
There is very little commercial software that comes from anywhere but the US. Nobody else innovates like we do. The fact that we have a large contingent of Indians in the US IT industry has little to do with a US need for R&D. It is very much about the phenomenon of outsourcing and the market for enterprise software.
The basic dynamic is this: the Fortune 500 is chock full of upper management who have, for the past 20 years, had almost zero MBAs with computer science undergraduate degrees. They. Do. Not. Know. How. To. Hire. Tech. Professionals. So they outsource. That is why companies like Deloitte, the consulting arm of the former PwC (acquired by IBM), the former Andersen Consulting (now Accenture; its accounting sibling Arthur Andersen was dismantled after Enron), and EDS (Ross Perot, remember him?) dominated in the 80s and 90s, basically all the way through the dot com boom.

In the meanwhile, DEC went broke and was sold to Compaq. Xerox abandoned computing. Eventually Compaq itself was absorbed by HP. Sun Microsystems was sold off to Oracle. Silicon Graphics went bankrupt. A bunch of smaller computer and software companies consolidated and then went broke. Intel crushed a whole lot of companies. So all kinds of R&D was done in the US, but the Fortune 500 did not hire the smart people who have always been around in the US. So as soon as they found out they could outsource technical support, boom, the bottom fell out of the markets and wages went down.
I saw this three ways: first as a call center rep, then as an IT consultant, and finally as a VP of Services. In that last job it was made perfectly transparent to me that ‘business as usual’ meant essentially ‘sweatshopping’ foreign-born IT staff and charging high rates for their services. It meant young people with hyped-up degrees from schools nobody in the US has ever heard of (and which are not so very prestigious in their home countries either) could get over here on work visas and be paid $30/hour while a company like Deloitte (for example) billed them out at $150/hour. This is the dirty secret of the IT industry.
For the past 10 years, roughly, I have been working with and building stuff on AWS. (BTW, a lot of the underlying tech of AWS was built by Twilio[1].) And nobody has needed any scientific research into anything AWS does. I can tell you without a moment’s hesitation that no PhDs are needed to build anything AWS has offered over the past decade. And yet still, most of the Fortune 500 has only adopted the cloud in the past 4 years or so.
But let’s put it starkly, shall we?
If you’re going to make a real dent in the digital world where you need PhD-level brains, then you would be making an operating system[2]. Clearly the biggest revolution of the past three decades is Linux and the Open Source movement. So first let me state the obvious. In cellphones, the world is basically Apple’s iOS vs. Google’s Android. The rest of the market splits up various ways. But in supercomputing and cloud servers, Linux dominates. It originated out of the efforts of Linus Torvalds, of course, but was modeled after UNIX, invented at Bell Labs, and BSD (the Berkeley Software Distribution, obviously from UC Berkeley). Look at the heritage of Linux[3]. We didn’t need any imported labor to create the operating systems that are at the core of the computing world, nor did we need it for the programming languages that dominate. Everything else is derivative. And stating more obviousness: TCP/IP, the network protocol of the Internet, was developed with US government (DARPA) funding, and Ethernet was developed at Xerox PARC.
To understand this at a deeper level, read this analysis by Eric Weinstein[4].
That study was a key link in a chain of evidence leading to an entirely different view of the real origins of the Immigration Act of 1990 and the H-1B visa classification. In this alternative account, American industry and Big Science convinced official Washington to put in place a series of policies that had little to do with any demographic concerns. Their aims instead were to keep American scientific employers from having to pay the full US market price of high-skilled labor. They hoped to keep the US research system staffed with employees classified as “trainees,” “students,” and “post-docs” for the benefit of employers. The result would be to render the US scientific workforce more docile and pliable to authority and senior researchers by attempting to ensure this labor market sector is always flooded largely by employer-friendly visa holders who lack full rights to respond to wage signals in the US labor market.
The creative genius in the digital world is American, and it started before Bill Gates.
Footnotes