"Gates knew that modern technology would require fundamental change every five to 10 years," she said. "He explained how the Internet would be really disruptive, perhaps in a creative way. It would affect the entire value proposition of what was being offered not only by his company, but by others."
But a lot of the heavy lifting came from a wide variety of companies, according to Bozman. She listed Netscape (GUI-based browser); Sun Microsystems (Java); SGI (visualization graphics); Red Hat and SUSE with enterprise Linux; Google, Docker and Sun (containers); VMware (virtualization); and Cisco, Broadcom, Brocade and Arista (routers and switches), among others. Importantly, Microsoft and Sun eventually forged more platform interoperability for Internet developers, who were on their way to widely adopting Linux-based technologies. One sign of this technology sea change is that today, Microsoft owns GitHub, the app developer repository for code, and Microsoft Azure runs Linux and Windows.
Bozman pointed out that despite being criticized as a latecomer to cloud, Microsoft hustled to make Azure the second-largest cloud service provider.
She remembers seeing technology demos in the 1990s led by Grove that showed early capabilities of the Internet, including video chat. She said Grove’s decision to get Intel out of the DRAM business — memory chips that drove the bulk of Intel’s business — and focus on microprocessors changed the course of computing for the masses.
Intel’s processors not only fed the PC and server revolution of the 1990s and 2000s, and later the cloud computing revolution; the move was also emblematic of the kind of radical risk-taking required at times to keep a technology company from being surpassed by more innovative competitors. Bozman said the majority of servers and storage devices used in enterprises and cloud service providers today are based on Intel x86 microprocessors, though she noted that IBM Power, AMD processors and Nvidia GPUs play strong roles, too.
"These two visionaries – Gates and Grove – don't take us to cloud, but they saw big change coming ahead of many others," Bozman said.
The Internet and Quest for Interoperability
Data centers in the ’80s and early ’90s typically had large mainframe computer systems. Many of them, Bozman noted, remain vital mainstays of transactional computing at the heart of Fortune 500 enterprise data centers. Back then, however, these systems were often monolithic and not connected to one another the way they are today. In the 1990s, scale-out and clustered systems based on Unix and Windows began to surround mainframes in many enterprise data centers, she explained.
"They were all good on their own, but clearly there were some disconnects," Bozman recalled. "Those systems did not easily speak with each other. There were ways you could force it, but it wasn't easy."
But change was inevitable, as Microsoft embraced interoperability with Linux and Unix, and IBM made multi-billion-dollar investments in Linux, which today extends across its product lines. Bozman credits CEOs like Microsoft’s Bill Gates, IBM’s Lou Gerstner, Intel’s Andy Grove and Salesforce’s Marc Benioff, among others, with charting new paths after seeing where the computing world was headed via the Internet. Eric Schmidt, now chairman of Alphabet, helped further system interoperability by pursuing Java ubiquity across all types of systems as head of Sun’s Technology Group (STG) in the 1990s.
Internet Standards Help Build Cloud On-Ramp
When the Internet came along, it provided a layer that eased interoperability between most systems, Bozman said. The moves from the Internet to virtualized data centers in the 2000s, and then to the commercialization of cloud computing in the 2008 time frame, were giant stepping stones that drove companies to modernize their IT.