Huawei Telecom Gear Much More Vulnerable to Hackers Than Rivals’ Equipment – WSJ


“Reminds me of the 1990s Microsoft Windows/Internet Explorer Security Issues, Not Stuxnet”

-Mayo615

Source: Huawei Telecom Gear Much More Vulnerable to Hackers Than Rivals’ Equipment, Report Says – WSJ

A detailed report prepared by Finite State, a Columbus, Ohio-based cybersecurity firm, concludes that Huawei telecom switching gear is far more vulnerable to hacking than other vendors’ hardware due to firmware flaws and inadvertent “back doors” that were discovered. The report has been circulated widely among cybersecurity experts in the U.S. and UK, and it is considered credible. It stops short of concluding that Huawei deliberately inserted the flaws to enable espionage; it appears more likely that they stem from undetected software development errors. The Trump Administration has nevertheless seized on the report as evidence of Chinese espionage intent. The report’s conclusions do offer sound evidence that Huawei gear should not be inserted into telecom systems until these errors are removed. This reminds me of the time when Microsoft Internet Explorer and Windows were suspected of being serious security risks for having so many security holes.

Huawei Enterprise Network Switch

From the Wall Street Journal:

WASHINGTON—Telecommunications gear made by China’s Huawei Technologies Co. is far more likely to contain flaws that could be leveraged by hackers for malicious use than equipment from rival companies, according to new research by cybersecurity experts that top U.S. officials said appeared credible.

Over half of the nearly 10,000 firmware images encoded into more than 500 variations of enterprise network-equipment devices tested by the researchers contained at least one such exploitable vulnerability, the researchers found. Firmware is the software that powers the hardware components of a computer.

The tests were compiled in a new report that has been submitted in recent weeks to senior officials in multiple government agencies in the U.S. and the U.K., as well as to lawmakers. The report is notable both for its findings and because it is circulating widely among Trump administration officials who said it further validated their policy decisions toward Huawei.

“This report supports our assessment that since 2009, Huawei has maintained covert access to some of the systems it has installed for international customers,” said a White House official who reviewed the findings. “Huawei does not disclose this covert access to customers nor local governments. This covert access enables Huawei to record information and modify databases on those local systems.”

The report, reviewed by The Wall Street Journal, was prepared by Finite State, a Columbus, Ohio-based cybersecurity firm.

While the report documents what it calls extensive cybersecurity flaws found in Huawei gear and a pattern of poor security decisions purportedly made by the firm’s engineers, it stops short of accusing the company of deliberately building weaknesses into its products. It also didn’t directly address U.S. claims that Huawei likely conducts electronic espionage for the Chinese government, which Huawei has long denied.

A Huawei official said the company welcomed independent research that could help improve the security of its products but added he couldn’t comment on specifics in the Finite State report because it wasn’t shared in full with the company.

“Without any details, we cannot comment on the professionalism and robustness of the analysis,” the Huawei official said.

Based in Shenzhen, Huawei is the world’s largest telecommunications equipment provider and a leader in next-generation 5G wireless technology.

Huawei has emerged as a central fixture in the growing rift between the U.S. and China over technology, especially with the approach of 5G cellular technology.

The Commerce Department in May cited national-security concerns when it added the telecommunications giant to its “entity list,” which prevents companies from supplying U.S.-origin technology to Huawei without U.S. government approval.

Finite State Chief Executive Matt Wyckhouse co-founded the firm in 2017, after spending nearly 13 years at nearby Battelle, a private, nonprofit applied-science and technology firm that does work in the private and public sectors.

Mr. Wyckhouse, a computer scientist who worked in Battelle’s national security division handling defense and intelligence-community contracts, said Finite State did the work pro bono and not on behalf of any government. He also said he felt the best way to make policy makers aware of the issues was to make his firm’s research available to the public. He plans to publish it this week.

“We want 5G to be secure,” Mr. Wyckhouse said.

Finite State said it used proprietary, automated systems to analyze more than 1.5 million unique files embedded within nearly 10,000 firmware images supporting 558 products within Huawei’s enterprise-networking product lines.

The company said the rate of vulnerabilities found in Huawei equipment was far higher than the average found in devices manufactured by its rivals, and that 55% of firmware images tested contained at least one vulnerability—which the authors described as a “potential backdoor”— that could allow an attacker with knowledge of the firmware and a corresponding cryptographic key to log into the device.

The report includes a case study comparing one of Huawei’s high-end network switches against similar devices from Arista Networks and Juniper Networks Inc. It found that Huawei’s device had higher risk factors in six of nine categories, generally by a substantial margin.

“In our experience, across the board, these are the highest numbers we have ever seen,” Mr. Wyckhouse said.

In one instance in the case study, Huawei’s network switch registered a 91% risk percentile for the number of credentials with hard-coded default passwords, compared against Finite State’s entire firmware data set.

By comparison, the risk level for Arista and Juniper was rated at 0%.

Chris Krebs, the top cybersecurity official at the Department of Homeland Security, said Finite State’s research added to existing concerns about Huawei equipment and the conclusion that the company hasn’t shown the intent or capability to improve its security practices.

“With Huawei having not demonstrated the technical proficiency or the commitment to build, deploy, and maintain trustworthy and secure equipment, magnified by the Chinese government’s potential to influence or compel a company like Huawei to do its bidding, we find it an unacceptable risk to use Huawei equipment today and in the future,” Mr. Krebs said.

White House officials who reviewed the Finite State report said the findings revealed flagrant violations of standard protocols. They said the report’s findings also suggested Huawei may be purposely designing its products to include weaknesses.

For example, some of the vulnerabilities found are well-known cybersecurity problems that aren’t difficult to avoid. Of the devices tested, 29% had at least one default username and password encoded into the firmware, which could allow malicious actors easy access to those devices if the credentials were left unchanged, according to the report.
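Hard-coded default credentials are also among the easiest flaws to detect once firmware is unpacked. As a rough illustration only (this is not the report’s actual methodology; the credential list and the sample extracted strings below are hypothetical):

```python
# Hypothetical sketch: flag known default credential pairs among strings
# extracted from an unpacked firmware image. The credential list and the
# sample strings are illustrative, not taken from the Finite State report.
KNOWN_DEFAULTS = {("admin", "admin"), ("root", "root"), ("admin", "123456")}

def find_default_credentials(extracted_strings):
    """Return any known default (user, password) pairs whose parts both
    appear in the extracted firmware strings."""
    strings = set(extracted_strings)
    return {(u, p) for (u, p) in KNOWN_DEFAULTS if u in strings and p in strings}

# Example run on strings pulled from a hypothetical firmware image
sample = ["version=1.2", "admin", "admin", "eth0", "root"]
hits = find_default_credentials(sample)
```

A real scanner would of course parse password files and configuration blobs rather than match bare strings, but the point stands: this class of flaw is cheap to find, which makes its prevalence notable.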

A particularly unusual finding was that security problems became quantifiably worse in at least one instance for users who patched a network switch with an updated version of firmware compared with the two-year-old version being replaced. Patches are intended to reduce cybersecurity weaknesses, but a comparison of the two versions found the newer one performed worse across seven of nine categories measured.

“For years, Huawei has essentially dared the international community to identify the security vulnerabilities that have so often been alleged regarding the use of the company’s products,” said Michael Wessel, a member of the U.S.-China Economic and Security Review Commission, a bipartisan panel that makes recommendations to Congress. “It’s hard to see the range and depth of the vulnerabilities identified by Finite State to be anything other than intentional.”

The U.K.’s National Cyber Security Centre also reviewed the Finite State research, people familiar with the matter said, and found it broadly aligned with the technical analysis in the agency’s own report, published in March. The U.K. report accused Huawei of repeatedly failing to address known security flaws in its products and admonished the firm for failing to demonstrate a commitment to fixing them.

A 2012 U.S. government review of security risks associated with Huawei didn’t find clear evidence that the company was being used by China as a tool for espionage, but concluded its gear presented cybersecurity risks due to the presence of many vulnerabilities that could be leveraged by hackers.

Rep. Mike Gallagher (R., Wis.) said the report highlights the urgency for members of Congress and others to stop Huawei from taking over the global telecommunications supply chain.

“I’ve long thought we should treat Huawei as an appendage of the Chinese Communist Party,” said Mr. Gallagher, who earlier this year introduced legislation targeting Chinese telecommunications firms. “But even I was taken aback by the scale of the security flaws within Huawei’s network architecture as revealed by the report.”

Integration of AI, IoT and Big Data: The Intelligent Assistant


The Evolution of These Technologies Is Clearer

The IoT Tower of Proprietary Babble Is Slowly Crumbling

The Rise of the Intelligent Assistant

Five years ago, I wrote a post on this blog disparaging the state of the Internet of Things/home automation market as a “Tower of Proprietary Babble.” Vendors of many different home and industrial product offerings were literally speaking different languages, making their products incompatible with complementary products from other vendors. The market was being constrained by its immaturity and a failure to grasp the importance of open standards. A 2017 Verizon report concluded that “an absence of industry-wide standards…represented greater than 50% of executives’ concerns about IoT.” Today I can report that the solutions and technologies are finally beginning to come together, albeit slowly.

 

One of the most important factors influencing these positive developments has been the recognition of this technology area’s importance by major corporate players, along with a large number of entrepreneurial companies funded by venture investment, as shown in the infographic above. Amazon, for example, announced in October 2018 that it had shipped over 100 million Echo devices, which effectively combine an intelligent assistant, a smart hub, and a large-scale database of information. This does not take into account the dozens of other companies that have launched their own entries. I like to point to Philips Hue as an example of corporate strategic focus perhaps changing the future prospects of Philips, the Eindhoven-based company, whose headquarters I have visited, as it tries to evolve beyond the incandescent lighting market. Two years ago my wife bought me a Philips Hue WiFi-controlled smart lighting starter kit. My initial reaction was disbelief that it would succeed. I am eating crow on that point, as I now control my lighting using Amazon’s Alexa and the Philips Hue smart hub. The rise of the “intelligent assistant” seems to have been a catalyst for growth and convergence.

The situation with proprietary silos of offerings that work poorly or not at all with other offerings is still frustrating, but slowly evolving. Amazon Firestick’s browser is its own awkward “Silk,” or alternatively Firefox, but it excludes Google’s Chrome for alleged competitive advantage. When I set up my Firestick, I had to ditch Chromecast because I have only so many HDMI ports. Alexa works with Spotify, but only in one room, as dictated by Spotify; Alexa can play music from Amazon Music or Sirius/XM on all Echo devices without that limitation. Which brings me to another point of aggravation: alleged smart TVs. Not only are they not truly “smart,” they are proprietary silos of their own, so “intelligent assistant” smart hubs do not work with them. Samsung, for example, has its own competing intelligent assistant, Bixby, so of course only Bixby can control a Samsung TV. I watched one of those YouTube DIY videos on how to make your TV work with Alexa using third-party software and remotes. Trust me, you do not want to go there. But cracks are beginning to appear that may lead to a flood of openness. Samsung just announced at CES that beginning in 2019 its smart TVs will work with Amazon Echo and Google Home, and that a later software update will likely enable older Samsung TVs to work with Echo and Home. However, Bixby will still control the remote. TVs from other manufacturers like Sony and LG have worked with intelligent assistants for some time.

The rise of an Internet of Everything Everywhere, the recognition of the need for greater data communication bandwidth, and battery-free wireless IoT sensors are heating up R&D labs everywhere. Keep in mind that I am focusing on the consumer side and have not even mentioned the rising demands from industrial applications. Intel has estimated that autonomous vehicles will transmit up to 4 terabytes of data daily, and AR and VR applications will require similar throughput. Existing wireless data communication technologies, including 5G and LTE, cannot address this need. In addition, an exploding need for IoT sensors not connected to an electrical power source will require more work in the area of “energy harvesting.” Energy harvesting began with passive RFID and has progressed to converting kinetic, piezo, and thermoelectric energy into a battery-free electrical power source for sensors. EnOcean, an entrepreneurial spinoff of Siemens in Munich, has pioneered this technology, but it is not sufficient for future market requirements.
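To put Intel’s figure in perspective, a quick back-of-envelope calculation shows the sustained data rate that 4 terabytes per day implies (assuming decimal terabytes and traffic spread evenly over 24 hours; real vehicle traffic would be burstier):

```python
# Back-of-envelope: average sustained rate implied by 4 TB/day.
# Assumes decimal units (1 TB = 1e12 bytes) and an even spread over
# 24 hours; these are my simplifying assumptions, not Intel's.
bytes_per_day = 4e12                  # 4 TB
bits_per_day = bytes_per_day * 8
seconds_per_day = 24 * 3600
avg_bits_per_second = bits_per_day / seconds_per_day
avg_mbps = avg_bits_per_second / 1e6
print(f"Average sustained rate: {avg_mbps:.0f} Mbps")
```

Roughly 370 Mbps of continuous throughput per vehicle helps explain why current consumer wireless links fall short.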

Fortunately, work has already begun on both higher-throughput wireless data communication using mmWave spectrum and energy harvesting using radio backscatter, reminiscent of Nikola Tesla’s dream of wireless electrical power distribution. The successful demonstration of these technologies could open the door to new IEEE data communication standards, potentially ending the Tower of Proprietary Babble and accelerating the integration of AI, IoT, and Big Data. The bottom line is that the market and the technology landscape are improving.

READ MORE: IEEE Talk: Integrated Big Data, The Cloud, & Smart Mobile: One Big Deal or Not? from David Mayes

My IEEE Talk from 2013 foreshadows current emerging trends in advanced technology as they appeared at the time. I proposed that they in fact represent one huge integrated convergence trend that has morphed into something even bigger and is already having a major impact on the way we live, work, and think. The 2012 Obama campaign’s sophisticated “Dashboard” application, integrating Big Data, the Cloud, and Smart Mobile, is referenced as perhaps the most significant example at that time of the combined power of these trends blending into one big thing.

READ MORE: Blog Post on IoT from July 20, 2013

The term “Internet of Things” (IoT) is being loosely tossed around in the media. But what does it mean? It means simply that data communication (like Internet communication, but not necessarily Internet Protocol packets) is emerging for all manner of “things” in the home, in your car, everywhere: light switches, lighting devices, thermostats, door locks, window shades, kitchen appliances, washers & dryers, home audio and video equipment, even pet food dispensers. You get the idea. It has also been called home automation. All of this communication occurs autonomously, without human intervention. The communication can be between and among these devices, so-called machine-to-machine (M2M) communication. The data communication can also terminate in a compute server, where the information can be acted on automatically or made available to the user to intervene remotely from a smartphone or any other Internet-connected device.
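The pattern just described, devices reporting autonomously to a hub that acts on the data without human intervention, can be sketched in a few lines. This is a toy model of my own, not any vendor’s actual API:

```python
# Toy model of M2M home automation: devices report readings to a hub,
# which applies an automation rule with no human in the loop. This is
# an illustrative sketch, not any vendor's actual API.
class Hub:
    def __init__(self):
        self.lights_on = False
        self.log = []          # readings retained for remote review

    def report(self, device, reading):
        """Called autonomously by devices (M2M); may trigger actions."""
        self.log.append((device, reading))
        if device == "motion_sensor" and reading == "motion":
            self.lights_on = True   # automated rule: motion -> lights on

hub = Hub()
hub.report("thermostat", 21.5)         # telemetry only, stored for later
hub.report("motion_sensor", "motion")  # triggers the lighting rule
```

The same log that drives the automatic rule is what a smartphone app would query for remote monitoring.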

Another key concept is the promise of automated energy efficiency, with the introduction of “smart meters” with data communication capability, also achieved in large commercial structures via the Leadership in Energy & Environmental Design (LEED) program. Some may recall that when Bill Gates built his multi-million-dollar mansion on Lake Washington in Seattle, he had “remote control” of his home built into it. Now, years later, Gates’ original home automation is obsolete. The dream of home automation has been around for years, through numerous Silicon Valley conferences and failed startups, and, needless to say, it went nowhere. But it is this concept of effortless home automation that has been the Holy Grail.

But this is also where the glowing promise of the Internet of Things begins to morph into a giant “hairball.” The term “hairball” was former Sun Microsystems CEO Scott McNealy‘s favorite term for a complicated mess. In hindsight, the early euphoric days of home automation were plagued by a lack of “convergence,” a term I use to describe the inability of available technology to meet the market opportunity. Without convergence, there can be no market opportunity beyond early-adopter techno geeks. Today, Moore’s Law and advances in data communication have finally swept away the convergence problem, but for many years the home automation market was stalled.

Also, as more Internet-connected devices emerged, it became apparent that these devices and apps were a hacker’s paradise. The concept of IoT was being implemented in very naive and immature ways, lacking common industry standards on basic issues: the kinds of things that the IETF and IEEE are famous for. These vulnerabilities are only now, very slowly, being resolved, and still in a fragmented, ad hoc manner. The central problem has not been addressed, due to classic proprietary “not invented here” mindsets.

The problem currently at the center of this hairball, and from all indications not likely to be resolved anytime soon, is that of multiple data communication protocols, many of them effectively proprietary, creating a huge incompatible Tower of Babbling Things. There is no meaningful industry- and market-wide consensus on how the Internet of Things should communicate with the rest of the Internet. Until this happens, there can be no fulfillment of the promise of the Internet of Things. I recently posted “Co-opetition: Open Standards Always Win,” which discusses the need for open standards in order for a market to scale up.

Read more: Co-opetition: Open Standards Always Win

A recent ZDNet post explains that home automation currently requires devices to connect with “multiple local- and wide-area connectivity options (ZigBee, Wi-Fi, Bluetooth, GSM/GPRS, RFID/NFC, GPS, Ethernet). Along with the ability to connect many different kinds of sensors, this allows devices to be configured for a range of vertical markets.” Huh? This is the problem in a nutshell; you do not need to be a data communication engineer to get the point. And this is not even close to a full discussion of the problem. There are also IoT vendors who believe that consumers should pay them for the ability to connect to their proprietary cloud. Imagine paying a fee for every protocol or sensor we employ in our homes. That’s a non-starter.
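The underlying scaling problem can be made concrete. With n incompatible protocols, full interoperability requires a translator for every pair, roughly n²/2 of them, whereas a single open standard needs only one adapter per protocol. A quick sketch, using the seven connectivity options in the ZDNet list:

```python
# Why proprietary protocol silos don't scale: pairwise bridges grow
# quadratically with the number of protocols, while adapters to a
# single common standard grow only linearly.
def pairwise_bridges(n_protocols: int) -> int:
    """Translators needed so every pair of protocols interoperates."""
    return n_protocols * (n_protocols - 1) // 2

def adapters_to_standard(n_protocols: int) -> int:
    """Adapters needed if every protocol maps to one open standard."""
    return n_protocols

# The ZDNet list alone names 7 connectivity options.
print(pairwise_bridges(7), "bridges vs", adapters_to_standard(7), "adapters")
```

Twenty-one pairwise bridges versus seven adapters, and the gap widens with every new proprietary entrant. That is the arithmetic behind “open standards always win.”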

The above laundry list of data communication protocols does not include the Zigbee “smart meter” communications standards war. The Zigbee protocol has been around for years and claims to be an open industry standard, but many do not agree. Zigbee still does not really work, and a new competing smart meter protocol has just entered the picture. The Bluetooth IEEE 802.15 standard may now be overtaken by the much more powerful 802.15.3a, and some are asking whether 4G LTE, NFC, or WiFi may eliminate Bluetooth altogether. A very cool new technology, energy harvesting, has begun to take off in the home automation market. Energy harvesting sensors (no batteries) can capture just enough kinetic, piezo, or thermoelectric energy to transmit short data communication “telegrams” to an energy harvesting router or server. The EnOcean Alliance has formed around a small German company spun off from Siemens and has attracted many leading companies in building automation. But EnOcean itself recently published an article in Electronic Design News announcing that it has created middleware “to incorporate battery-less devices into networks based on several different communication standards such as Wi-Fi, GSM, Ethernet/IP, BACnet, LON, KNX or DALI.” It is apparent that this space remains very confused, crowded, and uncertain. A new Cambridge, UK startup, Neul, is proposing yet another new IoT approach using the radio spectrum known as “white space,” becoming available with the transition from analog to digital television. With this much contention over protocols, there will be nothing but market paralysis.

Is everyone following all of these acronyms and data comm protocols?  There will be a short quiz at the end of this post. (smile)

The advent of IP version 6 (IPv6), strongly supported by Intel and Cisco Systems, has created another area of confusion. The problem with IPv6 in the world of the IoT is “too much information,” as we say. Cisco and Intel want to see IPv6 as the one global protocol for every Internet-connected device. This is utterly incompatible with energy harvesting, as the tiny amount of harvested energy cannot transmit the very long IPv6 packets. Hence EnOcean’s middleware, without which its market is essentially constrained.
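The mismatch is easy to quantify. An IPv6 base header alone is a fixed 40 bytes, while a typical harvested-energy sensor telegram is on the order of 14 bytes in total (that telegram size, and the assumption that transmit energy scales roughly linearly with bytes sent, are my illustrative simplifications):

```python
# Rough comparison: fixed IPv6 base header vs. a small harvested-energy
# telegram. The 14-byte telegram size and the linear energy-per-byte
# assumption are illustrative simplifications, not EnOcean figures.
IPV6_HEADER_BYTES = 40    # fixed IPv6 base header size (RFC 8200)
TELEGRAM_BYTES = 14       # typical small sensor telegram (assumed)

overhead_ratio = IPV6_HEADER_BYTES / TELEGRAM_BYTES
print(f"The IPv6 header alone is {overhead_ratio:.1f}x the whole telegram")
```

If sending each byte costs a harvesting sensor roughly the same sliver of scavenged energy, wrapping every reading in an IPv6 packet nearly triples the energy bill before a single bit of payload is sent.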

Then there is the ongoing new-standards and upgrade activity in the International Organization for Standardization (ISO), the Institute of Electrical and Electronics Engineers (IEEE), and various Special Interest Groups (SIGs), none of which seems to be moving toward any ultimate solution to the Tower of Babbling Things problem in the Internet of Things.

The Brave New World of Internet privacy issues relating to this tidal wave of Big Data is not even considered here and deserves a separate post. A recent NBC Technology post has explored many of these issues, while some have suggested we simply need to get over it: we have no privacy.

Read more: Internet of Things pits George Jetson against George Orwell

Stakeholders in the Internet of Things seem not to have learned the repeated lesson of open standards and co-opetition, and are concentrating on proprietary advantage, which ensures that this market will not effectively scale anytime in the foreseeable future. Intertwined with the Tower of Babbling Things are the problems of Internet privacy and consumer concerns about wireless communication health & safety issues. Taken together, this market is not ready for prime time.

 

“Specsmanship”: Missing the Point of a “Complete Product”


The Definition of “Specsmanship”

Wikipedia defines specsmanship as the inappropriate use of specifications or measurement results to establish the putative superiority of one entity over another, generally when no such superiority exists. It is commonly found in high-fidelity audio equipment, automobiles, and other apparatus where uneducated users seize on some numerical value as a basis for pride or derision, whether or not it is relevant to the actual use of the device. Smartphones and the early microprocessor market are also examples.

Two Specsmanship Case Studies

Most recently, we are seeing specsmanship in the smartphone market. As the market has matured into its 7th, 8th, and 9th generations, differentiation among products has been reduced to smaller and smaller differences: camera resolution, display size, alleged brightness, and so on. In earlier generations, Apple and the Android phone manufacturers created a highly effective intangible market need to possess their latest-generation phone, in which features were less important. I called this the smartphone “Star Wars” phenomenon, causing people to line up around the block as if to see the latest Star Wars film. Most market analysts now agree that the smartphone market frenzy has run its course. Apple’s strategy to reinvigorate the market by creating a higher price point has predictably fallen flat; the move surprised me because Apple’s marketers seemed to miss the consumer market sentiment. Water resistance, in my view, was the last major device feature with a real market need: protecting phones from the dreaded “toilet drop.” Samsung introduced water resistance in the 5th-generation Galaxy, and permanently in the Galaxy 7. I have not been motivated to buy a new phone since the Galaxy 7.

In another, more dramatic and pivotal example, my first personal experience of the specsmanship phenomenon was at Intel, during the first-generation microprocessor war: the Intel 8086 versus the Motorola 68000. Without diving too deeply into the technical specifications, the Intel 8086 on its face was technically inferior to the Motorola 68000, at a critical time when microprocessors were very new, customers had doubts, and the market was just beginning to establish a foothold in electronics design. Facing this marketing challenge, Intel’s Vice President of Marketing at the time, Bill Davidow, made a momentous decision: to differentiate Intel and the 8086 not on specifications but on Intel’s long-term vision for its microprocessor family, and to focus marketing efforts on the senior management of its customers, not the engineers. Davidow famously delivered a presentation to the Intel sales force, “How To Sell A Dog.” The message was to ignore the spec and concentrate on customers’ higher-level needs and the security of an investment in Intel, with its long-term vision to provide them with greater value and competitive advantage.

Motorola fatefully decided to concentrate its marketing strategy entirely on the superior technical specifications of the 68000, winning a small skirmish but losing the war; Intel dominates the general-purpose microprocessor market to this day. The Intel-versus-Motorola story is definitively detailed in Bill Davidow’s now-famous book, Marketing High Technology: An Insider’s View, which also includes numerous other gems of marketing insight. Bill’s thoughts on the barriers facing a new entrant into an existing market have stuck with me over the years.

If the smartphone market is ever to revive, it needs to learn from Davidow’s lesson, ignore the specs, and concentrate on creating a higher level marketing message that meets deep customer needs.

 

Bill Davidow, former Intel Marketing Vice President

 

 

HBS Professor Ted Levitt’s Total Product Concept And Its Influence On Davidow

Though I have met with Bill Davidow many times, spent time with him, and invited him to speak with executives of an emerging technology company, I have never directly asked him how much Harvard Professor Ted Levitt’s concept of a Total Product influenced him. It seems highly likely that it did. By way of example, marketers often refer to “product differentiation.” Specsmanship is the lowest possible form of product differentiation; creating a higher level of product value is its true essence. This is also the essence of Levitt’s now-legendary Total Product. What is different in the Intel case is my memory of how Levitt’s Total Product model was adapted at Intel. I will explain.

Harvard Business School Professor Ted Levitt

 

READ MORE: Levitt HBR: Marketing Success Through Differentiation of Anything

Levitt’s classic Total Product model is graphically displayed here:

In my personal view and recollection, which I show here, Davidow focused on the “Augmented Product,” “Expected Product,” and “Potential Product,” and avoided the “Generic Product,” to win the specsmanship war with Motorola. I also distinctly remember a slightly different Intel model, which is shown below.

The Intel Variation On The Ted Levitt Total Product Model


It is my recollection that we at Intel, and most likely Bill Davidow in particular, adapted the Ted Levitt model to Intel’s particular new market realities and focused on the outer circle, “Corporate Vision” and “Product Roadmap,” to win the microprocessor war. The “Engineering Deliverable” is not a product; it is only a naked engineering deliverable, and specsmanship does not make it a product. The “Corporate Vision” and “Product Roadmap” offer greater long-term value to customers and ultimately create a powerful brand image.

Big Data, Cloud, Smart Mobile And Even AR Morph Into One Mind Boggling Thing


David Mayes

IEEE Talk: Integrated Big Data, The Cloud, & Smart Mobile: Actually One Big Thing

This IEEE Talk discusses the three biggest trends in online technology and proposes that they in fact represent one huge integrated trend that is already having a major impact on the way we live, work, and think. The 2012 Obama campaign’s Dashboard mobile application, integrating Big Data, the Cloud, and Smart Mobile, is perhaps the most significant example of this trend, combining all three technologies into one big thing. A major shakeout and industry consolidation seem inevitable. Additional developments as diverse as augmented reality, the Internet of Things, Smart Grid, near-field communication, mobile payment processing, and location-based services are also considered as linked to this overall trend.

IEEE Talk: Integrated Big Data, The Cloud, & Smart Mobile: Big Deal or Not? Presentation Transcript

  • 1. Big Data, The Cloud, & Smart Mobile: Integrated Big Deal or Not? ©David Mayes 1
  • 2. IEEE: UBC Okanagan Wednesday, February 6th, 2013 ©David Mayes 2
  • 3. Speaker Introduction IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 3
  • 4. David Mayes: LinkedIn Profile: http://www.linkedin.com/in/mayo615 Personal Blog: http://mayo615.com UBC Office: EME 4151 (250) 807-9821 / Hours by appt. Email: david.mayes@ubc.ca mayo0615@gmail.com Mobile: (250) 864-9552 Twitter: @mayo615 Experience: Executive management, access to venture capital, International business development, sales & marketing, entrepreneurial mentorship, technology assessment, strategic planning, renewable energy technology. Intel Corporation (US/Europe/Japan), 01 Computers Group (UK) Ltd, Mobile Data International (Canada/Intl.), Silicon Graphics (US), Sun Microsystems (US), Ascend Communications (US/Intl.), P-Cube (US/Israel/Intl.), Global Internet Group LLP (US/Intl.), New Zealand Trade & Enterprise. IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 4
  • 5. Agenda • Some Historical Context • The Emergence of SoMoClo • The Emergence of Big Data • The Emergence of Smart Mobile • The Convergence of ToDaClo • What Do You Think? IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 5
  • 6. Some Historical Context IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 6
  • 7. Canada’s McLuhan: The First Hint “The new electronic interdependence recreates the world in the image of a global village.” Marshall McLuhan, “Gutenberg Galaxy”, 1962, Canadian author, educator, & philosopher (1911 – 1980) IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? Video: The “McLuhan” Scene from Annie Hall © David Mayes 7
  • 8. Stuart Brand, Jobs & Woz: The Whole Earth Catalog IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 8
  • 9. Grove, Noyce and Moore IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? “We had no idea at all that we had turned the first stone on something that was going to be an $80 billion business.” -Gordon Moore ©David Mayes 9
  • 10. Sir Tim Berners-Lee and Vint Cerf IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 10
  • 11. Agenda • Some Historical Context • The Emergence of SoMoClo • The Emergence of Big Data • The Emergence of Smart Mobile • The Convergence of ToDaClo • What Do You Think? IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not?
  • 12. The Emergence of SoMoClo IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? Social + Mobile + Cloud ©David Mayes 12
  • 13. Emergence of Social Media IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 13
  • 14. 2012 Social Media Market Landscape IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 14
  • 15. Emergence of “Cloud Computing” IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 15
  • 16. Emergence of End-user Cloud Apps IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 16
  • 17. 2012 Cloud Enterprise Players IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 17
  • 18. The Key Issue: Data Privacy, Reliability, and Security. Despite reassurances, there is no permanent solution, no silver bullet. The only solution is to unplug. IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 18
  • 19. Recent Cyber Security News: • Google Chairman, Eric Schmidt’s new book on China: • “the world’s most active and enthusiastic filterer of information” as well as “the most sophisticated and prolific” hacker of foreign companies. In a world that is becoming increasingly digital, the willingness of China’s government and state companies to use cyber crime gives the country an economic and political edge. • NY Times, WSJ hacking last week traced to China • Twitter theft of 250K users personal information last week • Sony PlayStation Anonymous hacks (twice in 2 weeks) IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 19
  • 20. Agenda • Some Historical Context • The Emergence of SoMoClo • The Emergence of Big Data • The Emergence of Smart Mobile • The Convergence of ToDaClo • What Do You Think? IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not?
  • 21. The Emergence of “Big Data” IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 21
  • 22. Emergence of “Big Data” • Major advances in scale and sophistication of government intelligence gathering and analysis • Cost no object • NSA PRISM global telecom surveillance program • Post-9/11 World IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 22
  • 23. An Interesting Scientific Analogy Chaos, with reference to chaos theory, refers to an apparent lack of order in a system that nevertheless obeys particular laws or rules; this understanding of chaos is synonymous with dynamical instability, a condition discovered by the physicist Henri Poincaré in the late 19th century that refers to an inherent lack of predictability in some physical systems. IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 23
  • 24. Key Drivers of the Emergence of Big Data • Moore’s Law – compute cost and power • Design rules, multi-core, 3D design • Massive cost decline in data storage • Emergence of solid state memristor • Google Spanner 1st global real-time database • DARPA “Python” programming language • Data Center data storage accumulation • 2.7 zettabytes currently and growing rapidly • A zettabyte equals 10²¹ bytes (1000 exabytes) IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 24
  • 25. The Big Data Landscape Today IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 25
  • 26. The Key Issue: Privacy “Get over it! You have no privacy!” Scott McNealy, former CEO of Sun Microsystems IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 26
  • 27. Agenda • Some Historical Context • The Emergence of SoMoClo • The Emergence of Big Data • The Emergence of Smart Mobile • The Convergence of ToDaClo • What Do You Think? IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not?
  • 28. The Emergence of Smart Mobile IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 28
  • 29. Emergence of Smart Mobile IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 29
  • 30. Key Drivers of Smart Mobile • Moore’s Law – compute cost and power • Design rules, multi-core, 3D design • Focus on reducing heat: gate leakage • Intel Atom “all day battery life” is a beginning • Massive cost decline in data storage • Mobile bandwidth:4G/LTE “no cost difference” • “White space” metro Wi-Fi potential maybe • New available spectrum between digital TV channels: increased transmit power • PC market death: Dell Computer & HP IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 30
  • 31. Mobile-based Services • GPS, Cloud, personal and database info on mobile • Geotagging from current location tied to your objective: • Find merchandise, restaurant, bar, etc. • Find and tag people • Find people with similar interests nearby • The rise of the mobile gaming market • Already well-established in Hong Kong, Seoul • North America far behind Asian telecom markets • Facebook has just announced LBS plans • The downside: battery drain issue still critical • “People want their phones to do too much” • 4G LTE, Wifi, Bluetooth, GPS, Streaming, Mobile Gaming IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 31
  • 32. Location-based Services Landscape IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 32
  • 33. Agenda • Some Historical Context • The Emergence of SoMoClo • The Emergence of Big Data • The Emergence of Smart Mobile • The Convergence of ToDaClo • What Do You Think? IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not?
  • 34. The Convergence of “ToDaClo” Touch + Data + Cloud IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 34
  • 36. Agenda • Some Historical Context • The Emergence of SoMoClo • The Emergence of Big Data • The Emergence of Smart Mobile • The Convergence of ToDaClo • What Do You Think? IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not?
  • 37. Discussion: Big Data, The Cloud, and Smart Mobile, Big Deal or Not? IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 37
  • 38. My Key Takeaway Points • Even from the 50,000 foot level, a shakeout and consolidation seem inevitable • A lot of people are going to lose a lot of money • There will be “snake oil” sold that does not work • Nevertheless these three new markets are actually one unified market, and likely: The Next Big Thing IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 38
  • 39. What Do You Think? • No. ToDaClo is mostly media hype, and not a “Big Deal.” • I’m skeptical. ToDaClo will probably be a “Big Deal,” but I haven’t seen much yet • Maybe. I do not know yet whether ToDaClo will be a Big Deal • Yes. ToDaClo is a Big Deal and it is already changing our lives IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 39
  • 40. Thank You! IEEE UBC Okanagan Big Data, The Cloud, and Smart Mobile: Big Deal or Not? ©David Mayes 40
  • 41. ©David Mayes 41

 

The Internet of Things: The Promise Versus the Tower of Hacked Babbling Things



The term “Internet of Things” (IoT) is being loosely tossed around in the media. But what does it mean? It means, simply, that data communication, similar to Internet communication but not necessarily carried as Internet Protocol packets, is emerging for all manner of “things” in the home, in your car, everywhere: light switches, lighting devices, thermostats, door locks, window shades, kitchen appliances, washers and dryers, home audio and video equipment, even pet food dispensers. You get the idea. It has also been called home automation. All of this communication occurs autonomously, without human intervention, between and among devices, in so-called machine-to-machine (M2M) communication. The data communication can also terminate in a compute server, where the information can be acted on automatically or made available for the user to intervene remotely from a smartphone or any other Internet-connected device.

Another key concept is the promise of automated energy efficiency, introduced with “smart meters” capable of data communication and achieved in large commercial structures via the Leadership in Energy & Environmental Design (LEED) program. Some may recall that when Bill Gates built his multi-million-dollar mansion on Lake Washington near Seattle, he had “remote control” of his home built into it. Now, years later, Gates’ original home automation is obsolete. The dream of home automation has been around for years, through numerous Silicon Valley conferences and failed startups, and, needless to say, it went nowhere. But this concept of effortless home automation has remained the Holy Grail.

But this is also where the glowing promise of the Internet of Things begins to morph into a giant “hairball,” former Sun Microsystems CEO Scott McNealy’s favorite term for a complicated mess. In hindsight, the early euphoric days of home automation were plagued by a lack of “convergence,” a term I use to describe the inability of available technology to meet the market opportunity. Without convergence there can be no market opportunity beyond early-adopter techno geeks, and for many years the home automation market was stalled. Today, Moore’s Law and advances in data communication have finally swept the convergence problem away.

Also, as more Internet-connected devices emerged, it became apparent that these devices and apps were a hacker’s paradise. The concept of IoT was being implemented in very naive and immature ways, lacking common industry standards on basic issues: the kinds of things that the IETF and IEEE are famous for. These vulnerabilities are only now, very slowly, being resolved, and still in a fragmented, ad hoc manner. The central problem has not been addressed, due to classic proprietary “not invented here” mindsets.

The problem currently at the center of this hairball, and from all indications not likely to be resolved anytime soon, is the proliferation of data communication protocols, many of them effectively proprietary, creating a huge, incompatible Tower of Babbling Things. There is no meaningful industry- and market-wide consensus on how the Internet of Things should communicate with the rest of the Internet, and until there is, the promise of the Internet of Things cannot be fulfilled. I recently posted “Co-opetition: Open Standards Always Win,” which discusses the need for open standards in order for a market to scale up.

Read more: Co-opetition: Open Standards Always Win

A recent ZDNet post explains that home automation currently requires devices to connect with “multiple local- and wide-area connectivity options (ZigBee, Wi-Fi, Bluetooth, GSM/GPRS, RFID/NFC, GPS, Ethernet). Along with the ability to connect many different kinds of sensors, this allows devices to be configured for a range of vertical markets.” Huh? This is the problem in a nutshell; you do not need to be a data communication engineer to get the point, and this is not even close to a full discussion of the problem. There are also IoT vendors who believe that consumers should pay them for the ability to connect to their proprietary cloud. So imagine paying a fee for every protocol or sensor we employ in our homes. That’s a non-starter.

The above laundry list of data communication protocols does not even include the ZigBee “smart meter” communications standards war. The ZigBee protocol has been around for years and claims to be an open industry standard, but many do not agree; it still does not really work, and a new competing smart-meter protocol has just entered the picture. The Bluetooth IEEE 802.15 standard may now be overtaken by the much more powerful 802.15.3a, and some are asking whether 4G LTE, NFC, or Wi-Fi may eliminate Bluetooth altogether. Meanwhile, a very cool new technology, energy harvesting, has begun to take off in the home automation market. Energy-harvesting sensors (no batteries) capture just enough kinetic, piezoelectric, or thermoelectric energy to transmit short data communication “telegrams” to an energy-harvesting router or server. The EnOcean Alliance has been formed around a small German company spun off from Siemens and has attracted many leading companies in building automation. But EnOcean itself recently published an article in Electronic Design News announcing that it has created “middleware” “…to incorporate battery-less devices into networks based on several different communication standards such as Wi-Fi, GSM, Ethernet/IP, BACnet, LON, KNX or DALI.” It is apparent that this space remains confused, crowded, and uncertain. A new Cambridge, UK startup, Neul, is proposing yet another IoT approach using the radio spectrum known as “white space,” becoming available with the transition from analog to digital television. With this much contention over protocols, there will be nothing but market paralysis.

Is everyone following all of these acronyms and data comm protocols?  There will be a short quiz at the end of this post. (smile)

The advent of IP version 6 (IPv6), strongly supported by Intel and Cisco Systems, has created another area of confusion. The problem with IPv6 in the world of the IoT is “too much information,” as we say. Cisco and Intel want IPv6 to be the one global protocol for every Internet-connected device. But this is utterly incompatible with energy harvesting: the tiny amount of harvested energy cannot transmit the comparatively long IPv6 packets. Hence EnOcean’s middleware, without which its market would be severely constrained.
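A rough back-of-envelope sketch makes the mismatch concrete. An IPv6 packet carries a fixed 40-byte base header before any payload; the telegram size below is an illustrative assumption, not an EnOcean specification:

```python
# Back-of-envelope comparison of minimum IPv6/UDP framing vs. a short
# energy-harvesting "telegram". The telegram size is an illustrative
# assumption, not an EnOcean specification.

IPV6_HEADER_BYTES = 40      # fixed IPv6 base header (RFC 8200)
UDP_HEADER_BYTES = 8        # minimal transport header on top of IPv6
SENSOR_PAYLOAD_BYTES = 4    # e.g. one switch event or temperature reading
TELEGRAM_BYTES = 14         # assumed total size of a harvested-energy telegram

def ipv6_frame_size(payload_bytes: int) -> int:
    """Total bytes on the air to carry a payload over IPv6/UDP."""
    return IPV6_HEADER_BYTES + UDP_HEADER_BYTES + payload_bytes

total = ipv6_frame_size(SENSOR_PAYLOAD_BYTES)
print(f"IPv6/UDP: {total} bytes vs. telegram: {TELEGRAM_BYTES} bytes")
print(f"Roughly {total / TELEGRAM_BYTES:.1f}x the bytes, and radio energy, per reading")
```

Even before payload, the protocol overhead alone exceeds the entire energy budget of a battery-less transmitter several times over, which is why EnOcean cannot simply adopt IPv6 end to end.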

Then there is the ongoing standards and upgrade activity in the International Organization for Standardization (ISO), the Institute of Electrical and Electronics Engineers (IEEE), and various Special Interest Groups (SIGs), none of which seems to be moving toward any ultimate solution to the Tower of Babbling Things problem in the Internet of Things.

The Brave New World of Internet privacy issues relating to this tidal wave of Big Data is not even considered here and deserves a separate post. A recent NBC Technology post has explored many of these issues, while some have suggested we simply need to get over it: we have no privacy.

Read more: Internet of Things pits George Jetson against George Orwell

Stakeholders in the Internet of Things seem not to have learned the repeated lesson of open standards and co-opetition, concentrating instead on proprietary advantage, which ensures that this market will not scale effectively in the foreseeable future. Intertwined with the Tower of Babbling Things are the problems of Internet privacy and consumer concerns about wireless communication health and safety. Taken together, this market is not ready for prime time.

 

WCW III: World Chip War III

After something of a long hiatus, we have an emerging epic World Chip War Three, which is being fought over “CODECS,” and related chips which power our smartphones. Not that the semiconductor industry hasn’t been innovating and evolving, but this is something much bigger. Today’s news about Broadcom’s bid for Qualcomm omits the other crucial player in this new War of Titans, Intel, which has risen from earlier ignominious failures to become the third player in WCW III.


 Intel: The Missing Piece In The Epic New Global Microchip Battle

In the beginning, in the early 1970s, there were the original semiconductor companies: Intel, AMD, Motorola, and, not far behind, the Japanese giants NEC, Fujitsu, and Mitsubishi. The first great chip war was fought over memory chips, primarily as replacements for magnetic core memory and for the emerging minicomputer industry. Japan fought World Chip War One as a nation, using the power and influence of its entire government against the American companies. At the behest of the U.S. government itself, IBM bought a minority share in Intel to defend it against any potential hostile bid from the Japanese. Not long afterward, the Great Microprocessor War, World Chip War Two, exploded, primarily between Intel and Motorola. Intel was the victor, primarily due to the extraordinary marketing genius of Intel Marketing VP Bill Davidow’s “Crush” campaign, not superior Intel technology: a huge lesson in the importance of marketing over having the “coolest technology.”

Broadcom’s Bid For Qualcomm Marks Upheaval in Chip Industry

The chip maker made an unsolicited $105 billion takeover bid for Qualcomm

Broadcom proposed to acquire rival chip maker Qualcomm for $70 per share. PHOTO: MIKE BLAKE/REUTERS

Broadcom Ltd. made an unsolicited $105 billion takeover bid for Qualcomm Inc., the chip industry’s boldest bet yet that size will equal strength at a time of technological upheaval.

The approach, which would mark the biggest technology takeover ever, shows how tech companies are positioning themselves for a world where a range of chip-driven devices—from phones to cars to factory robots—are transmitting, receiving and processing evermore information. Broadcom Chief Executive Hock Tan already has used acquisitions to build the company into the fourth-biggest chip maker by market value, part of a wave of industry consolidation as profits on some chips, such as those used in personal computers, are squeezed by sluggish sales and rising costs.

A combination with Qualcomm would create a behemoth whose chips manage communications among consumer devices and appliances, phone service providers, and data centers that are becoming the workhorses in artificial intelligence.

The deal is far from certain. San Diego-based Qualcomm, which said it would consider the proposal, is expected ultimately to rebuff it on the grounds that the price isn’t high enough, especially given the significant risk that regulators would block it, according to some analysts. Under typical circumstances, unfriendly bids like this are difficult to pull off; given the sheer size and complexity of Qualcomm, this one could be especially challenging, analysts said Monday.

Broadcom’s preference is to strike a friendly deal, but if it fails to do so, it would consider nominating Qualcomm directors who may be more amenable to a transaction, a person familiar with the matter said. The nomination deadline is Dec. 8, and the annual meeting at which the director vote would take place is likely to be around March.

Broadcom offered $70 a share for Qualcomm, representing a 28% premium from its closing price on Thursday—before news reports on the expected approach.
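As a quick sanity check on the quoted premium, the pre-news closing price implied by the $70 offer follows from simple arithmetic (the implied price is derived here, not quoted in the article):

```python
# Sanity check on the quoted 28% premium: derive the implied pre-news
# closing price from the $70-per-share offer.

offer_per_share = 70.00
premium = 0.28

implied_close = offer_per_share / (1 + premium)
print(f"Implied pre-news close: ${implied_close:.2f}")  # about $54.69
```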

Qualcomm shares ended trading Monday up 1.2% to $62.52, while Broadcom shares were 1.4% higher at $277.52.

Mr. Tan said he has been talking with Qualcomm for over a year about a possible tie-up. “Our strategy has been consistent,” Mr. Tan said in an interview. “When a business is No. 1 in technology and No. 1 in market position, we acquire it and put it on our Broadcom platform and grow through that strategy. Qualcomm has a very large sustainable franchise that meets those criteria.”

Should the deal be completed, Broadcom would take on Qualcomm’s leadership in developing the next wave of cellular technology, known as 5G, which is expected to roll out over the coming two years. That could give Broadcom a new growth engine, as 5G is expected to dramatically accelerate the speed and responsiveness of cellular communications necessary for applications like self-driving cars.

Broadcom was formed when Avago Technologies Ltd. bought the former Broadcom in 2015 for $39 billion and kept the name, and Mr. Tan has continued growing by acquisition. The company sells a diverse line of equipment for networking and communications. Its products include chips for Wi-Fi and Bluetooth technology that connect devices that are closer together—technologies that some analysts say are likely to grow less quickly than 5G.

“People will continue to use short-proximity wireless like Wi-Fi and Bluetooth, but the growth and money is clearly in 5G,” said analyst Patrick Moorhead of Moor Insights & Strategy.

Overall, Broadcom and Qualcomm have largely complementary product lines. But the possible Broadcom takeover is likely to face intense regulatory scrutiny, given the companies’ combined scale and the fact that they are both leaders in Wi-Fi and Bluetooth technology. The companies share customers including Apple Inc., whose iPhones and iPads include components from both Qualcomm and Broadcom.

Qualcomm already has been under pressure from antitrust agencies in several jurisdictions, including the U.S. The company has paid hefty regulatory fines in China, South Korea and Taiwan.

Qualcomm was riding high as recently as a year ago after unveiling the chip industry’s largest-ever acquisition: a $39 billion proposed deal for NXP Semiconductors NV. The deal hasn’t closed yet, and Broadcom said Monday that its proposal would stand regardless of whether Qualcomm’s proposed acquisition of NXP is consummated under the current terms.

Since then, a string of hits by regulators, competitors, and customers including Apple has left the industry titan in a vulnerable position. Qualcomm’s profit in the fiscal year that ended Sept. 24 plummeted 57%, and its share price declined 18% in the 12 months through Thursday’s close compared with a 58% rise in the PHLX Semiconductor Sector Index. That was before news of Broadcom’s interest sent Qualcomm shares up nearly 13% on Friday.

Funding for the deal would come in the form of loans from a gaggle of banks, with additional cash from Silver Lake Management LLC. The private-equity firm, which already owns a stake in Broadcom, provided a commitment letter for $5 billion in convertible debt. Silver Lake said a substantial portion of that capital would come in the form of an equity investment from its Silver Lake Partners fund, with the remainder from other sources.

The equity contribution would be the single largest in the history of the firm, exceeding the roughly $1 billion it invested in the merger of Dell Inc. and EMC Corp.

Broadcom’s bid came days after the Singapore-based company announced plans to relocate its headquarters to the U.S., a move that could make it easier to pursue acquisitions of U.S. targets.

Broadcom’s earlier $5.5 billion offer to buy Brocade Communications Systems, based in San Jose, Calif., has been delayed due to a review by the Committee on Foreign Investment in the United States (CFIUS), which reviews international deals that raise concerns about national security.

Any deal to acquire Qualcomm would also receive close scrutiny, experts say. “Anything that has the word semiconductor in it gets rapt attention from CFIUS,” said James Lewis of the Center for Strategic and International Studies, a policy think tank. “The move to the U.S. is an effort to tamp down CFIUS concerns.”

Google’s Quantum Dream May Be Just Around The Corner

In 1981, Richard Feynman, probably the most famous physicist of his time asked the question: “Can we simulate physics on a computer?” At the time the answer was “theoretically yes,” but practically not at that time. Today, we may be on the verge of answering “yes” in practice to Feynman’s original question. Quantum computers operate in such a strange way and are so radically different from today’s computers that it requires some understanding of quantum mechanics and bizarre properties like “quantum entanglement.” Quantum computers are in a realm orders of magnitude beyond today’s supercomputers and their application in specific computational problems like cryptography, Big Data analysis, computational fluid dynamics (CFD), and sub-atomic physics will change our World. Canadian quantum computing company, D-Wave Systems has been at the center of Google’s efforts to pioneer this technology.



Reblogged from New Scientist

Google’s Quantum Dream May Be Just Around the Corner


31 August 2016

Revealed: Google’s plan for quantum computer supremacy

The field of quantum computing is undergoing a rapid shake-up, and engineers at Google have quietly set out a plan to dominate

SOMEWHERE in California, Google is building a device that will usher in a new era for computing. It’s a quantum computer, the largest ever made, designed to prove once and for all that machines exploiting exotic physics can outperform the world’s top supercomputers.

And New Scientist has learned it could be ready sooner than anyone expected – perhaps even by the end of next year.

The quantum computing revolution has been a long time coming. In the 1980s, theorists realised that a computer based on quantum mechanics had the potential to vastly outperform ordinary, or classical, computers at certain tasks. But building one was another matter. Only recently has a quantum computer that can beat a classical one gone from a lab curiosity to something that could actually happen. Google wants to create the first.

The firm’s plans are secretive, and Google declined to comment for this article. But researchers contacted by New Scientist all believe it is on the cusp of a breakthrough, following presentations at conferences and private meetings.

“They are definitely the world leaders now, there is no doubt about it,” says Simon Devitt at the RIKEN Center for Emergent Matter Science in Japan. “It’s Google’s to lose. If Google’s not the group that does it, then something has gone wrong.”

We have had a glimpse of Google’s intentions. Last month, its engineers quietly published a paper detailing their plans (arxiv.org/abs/1608.00263). Their goal, audaciously named quantum supremacy, is to build the first quantum computer capable of performing a task no classical computer can.

“It’s a blueprint for what they’re planning to do in the next couple of years,” says Scott Aaronson at the University of Texas at Austin, who has discussed the plans with the team.

So how will they do it? Quantum computers process data as quantum bits, or qubits. Unlike classical bits, these can store a mixture of both 0 and 1 at the same time, thanks to the principle of quantum superposition. It’s this potential that gives quantum computers the edge at certain problems, like factoring large numbers. But ordinary computers are also pretty good at such tasks. Showing quantum computers are better would require thousands of qubits, which is far beyond our current technical ability.

Instead, Google wants to claim the prize with just 50 qubits. That’s still an ambitious goal – publicly, they have only announced a 9-qubit computer – but one within reach.


To help it succeed, Google has brought the fight to quantum’s home turf. It is focusing on a problem that is fiendishly difficult for ordinary computers but that a quantum computer will do naturally: simulating the behaviour of a random arrangement of quantum circuits.

Any small variation in the input into those quantum circuits can produce a massively different output, so it’s difficult for the classical computer to cheat with approximations to simplify the problem. “They’re doing a quantum version of chaos,” says Devitt. “The output is essentially random, so you have to compute everything.”

To push classical computing to the limit, Google turned to Edison, one of the most advanced supercomputers in the world, housed at the US National Energy Research Scientific Computing Center. Google had it simulate the behaviour of quantum circuits on increasingly larger grids of qubits, up to a 6 × 7 grid of 42 qubits.

This computation is difficult because as the grid size increases, the amount of memory needed to store everything balloons rapidly. A 6 × 4 grid needed just 268 megabytes, less than found in your average smartphone. The 6 × 7 grid demanded 70 terabytes, roughly 10,000 times that of a high-end PC.
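These memory figures follow directly from storing the full quantum state vector: an n-qubit system has 2^n complex amplitudes. A minimal sketch, assuming 16 bytes per amplitude (double-precision complex), reproduces the numbers quoted here:

```python
# Memory to store the full state vector of an n-qubit system: 2**n complex
# amplitudes. Assumes 16 bytes per amplitude (double-precision complex);
# the 268 MB and 70 TB figures match this assumption, while the 2.252 PB
# figure quoted later for 48 qubits corresponds to 8 bytes per amplitude.

BYTES_PER_AMPLITUDE = 16  # complex128: two 8-byte floats

def state_vector_bytes(n_qubits: int) -> int:
    """Bytes required to hold all 2**n complex amplitudes."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

print(f"{state_vector_bytes(24) / 1e6:.0f} MB")   # 6 x 4 grid of 24 qubits
print(f"{state_vector_bytes(42) / 1e12:.0f} TB")  # 6 x 7 grid of 42 qubits
```

The doubling of memory with every added qubit is exactly why classical simulation hits a wall around 45 to 50 qubits.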

Google stopped there because going to the next size up is currently impossible: a 48-qubit grid would require 2.252 petabytes of memory, almost double that of the top supercomputer in the world. If Google can solve the problem with a 50-qubit quantum computer, it will have beaten every other computer in existence.

Eyes on the prize

By setting out this clear test, Google hopes to avoid the problems that have plagued previous claims of quantum computers outperforming ordinary ones – including some made by Google.

Last year, the firm announced it had solved certain problems 100 million times faster than a classical computer by using a D-Wave quantum computer, a commercially available device with a controversial history. Experts immediately dismissed the results, saying they weren’t a fair comparison.

Google purchased its D-Wave computer in 2013 to figure out whether it could be used to improve search results and artificial intelligence. The following year, the firm hired John Martinis at the University of California, Santa Barbara, to design its own superconducting qubits. “His qubits are way higher quality,” says Aaronson.

It’s Martinis and colleagues who are now attempting to achieve quantum supremacy with 50 qubits, and many believe they will get there soon. “I think this is achievable within two or three years,” says Matthias Troyer at the Swiss Federal Institute of Technology in Zurich. “They’ve showed concrete steps on how they will do it.”

Martinis and colleagues have discussed a number of timelines for reaching this milestone, says Devitt. The earliest is by the end of this year, but that is unlikely. “I’m going to be optimistic and say maybe at the end of next year,” he says. “If they get it done even within the next five years, that will be a tremendous leap forward.”

The first successful quantum supremacy experiment won’t give us computers capable of solving any problem imaginable – based on current theory, those will need to be much larger machines. But having a working, small computer could drive innovation, or augment existing computers, making it the start of a new era.

Aaronson compares it to the first self-sustaining nuclear reaction, achieved by the Manhattan project in Chicago in 1942. “It might be a thing that causes people to say, if we want a full-scalable quantum computer, let’s talk numbers: how many billions of dollars?” he says.

Solving the challenges of building a 50-qubit device will prepare Google to construct something bigger. “It’s absolutely progress to building a fully scalable machine,” says Ian Walmsley at the University of Oxford.

For quantum computers to be truly useful in the long run, we will also need robust quantum error correction, a technique to mitigate the fragility of quantum states. Martinis and others are already working on this, but it will take longer than achieving quantum supremacy.

Still, achieving supremacy won’t be dismissed.

“Once a system hits quantum supremacy and is showing clear scale-up behaviour, it will be a flare in the sky to the private sector,” says Devitt. “It’s ready to move out of the labs.”

“The field is moving much faster than expected,” says Troyer. “It’s time to move quantum computing from science to engineering and really build devices.”

D-Wave Quantum Machine Tested by NASA and Google Shows Promise


Researchers from Google’s AI Lab say a controversial quantum machine that it and NASA have been testing since 2013 resoundingly beat a conventional computer in a series of tests.

Source: Controversial Quantum Machine Tested by NASA and Google Shows Promise | MIT Technology Review

Inside this box is a superconducting chip, cooled to within a fraction of a degree of absolute zero, that might put new power behind artificial-intelligence software.

Google says it has proof that a controversial machine it bought in 2013 really can use quantum physics to work through a type of math that’s crucial to artificial intelligence much faster than a conventional computer.

Governments and leading computing companies such as Microsoft, IBM, and Google are trying to develop what are called quantum computers because using the weirdness of quantum mechanics to represent data should unlock immense data-crunching powers. Computing giants believe quantum computers could make their artificial-intelligence software much more powerful and unlock scientific leaps in areas like materials science. NASA hopes quantum computers could help schedule rocket launches and simulate future missions and spacecraft. “It is a truly disruptive technology that could change how we do everything,” said Rupak Biswas, director of exploration technology at NASA’s Ames Research Center in Mountain View, California.

Biswas spoke at a media briefing at the research center about the agency’s work with Google on a machine the search giant bought in 2013 from Canadian startup D-Wave systems, which is marketed as “the world’s first commercial quantum computer.” The computer is installed at NASA’s Ames Research Center in Mountain View, California, and operates on data using a superconducting chip called a quantum annealer. A quantum annealer is hard-coded with an algorithm suited to what are called “optimization problems,” which are common in machine-learning and artificial-intelligence software.
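The "optimization problems" an annealer is hard-coded for are commonly written as a QUBO (quadratic unconstrained binary optimization): find the bit vector x minimizing x^T Q x. The sketch below solves a tiny hypothetical instance with classical simulated annealing; it illustrates the problem class, not D-Wave's hardware or API:

```python
# Hedged illustration: a classical simulated-annealing solver for a QUBO,
# the kind of optimization problem a quantum annealer targets.
import math
import random

random.seed(0)  # reproducible run

def qubo_energy(x, Q):
    """Energy x^T Q x for a bit vector x and a square matrix Q."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def simulated_annealing(Q, steps=5000, temp=2.0, cooling=0.999):
    n = len(Q)
    x = [random.randint(0, 1) for _ in range(n)]
    e = qubo_energy(x, Q)
    best_x, best_e = list(x), e
    for _ in range(steps):
        i = random.randrange(n)        # propose flipping one bit
        x[i] ^= 1
        e_new = qubo_energy(x, Q)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if e_new <= e or random.random() < math.exp((e - e_new) / temp):
            e = e_new
            if e < best_e:
                best_x, best_e = list(x), e
        else:
            x[i] ^= 1                  # reject: undo the flip
        temp *= cooling
    return best_x, best_e

# Hypothetical 3-variable instance: diagonal terms reward setting bits,
# off-diagonal terms penalize certain pairs; the optimum is x = [1, 0, 1].
Q = [[-1, 2, 0],
     [0, -1, 2],
     [0, 0, -1]]
best_x, best_e = simulated_annealing(Q)
```

A quantum annealer attacks the same energy landscape physically, using quantum tunneling rather than thermal hops; whether that yields a real advantage over classical methods like the one above is exactly the controversy the article describes.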

However, D-Wave’s chips are controversial among quantum physicists. Researchers inside and outside the company have been unable to conclusively prove that the devices can tap into quantum physics to beat out conventional computers.

Hartmut Neven, leader of Google’s Quantum AI Lab in Los Angeles, said today that his researchers have delivered some firm proof of that. They set up a series of races between the D-Wave computer installed at NASA against a conventional computer with a single processor. “For a specific, carefully crafted proof-of-concept problem we achieve a 100-million-fold speed-up,” said Neven.

Google posted a research paper describing its results online last night, but it has not been formally peer-reviewed. Neven said that journal publications would be forthcoming.

Google’s results are striking—but even if verified, they would only represent partial vindication for D-Wave. The computer that lost in the contest with the quantum machine was running code that had it solve the problem at hand using an algorithm similar to the one baked into the D-Wave chip. An alternative algorithm is known that could have let the conventional computer be more competitive, or even win, by exploiting what Neven called a “bug” in D-Wave’s design. Neven said the test his group staged is still important because that shortcut won’t be available to regular computers when they compete with future quantum annealers capable of working on larger amounts of data.

Matthias Troyer, a physics professor at the Swiss Federal Institute of Technology, Zurich, said making that come true is crucial if chips like D-Wave’s are to become useful. “It will be important to explore if there are problems where quantum annealing has advantages over even the best classical algorithms, and to find if there are classes of application problems where such advantages can be realized,” he said, in a statement with two colleagues.

Last year Troyer’s group published a high-profile study of an earlier D-Wave chip that concluded it didn’t offer advantages over conventional machines. That question has now been partially resolved, they say. “Google’s results indeed show a huge advantage on these carefully chosen instances.”

Google is competing with D-Wave to make a quantum annealer that could do useful work. Last summer the Silicon Valley giant opened a new lab in Santa Barbara, headed by a leading academic researcher, John Martinis (see “Google Launches Effort to Build Its Own Quantum Computer”).

Martinis is also working on quantum hardware that would not be limited to optimization problems, as annealers are. A universal quantum computer, as such a machine would be called, could be programmed to take on any problem and would be much more useful but is expected to take longer to perfect. Government and university labs, Microsoft (see “Microsoft’s Quantum Mechanics”), and IBM (see “IBM Shows Off a Quantum Computing Chip”) are also working on that technology.

John Giannandrea, a VP of engineering at Google who coördinates the company’s research, said that if quantum annealers could be made practical, they would find many uses powering up Google’s machine-learning software. “We’ve already encountered problems in the course of our products impractical to solve with existing computers, and we have a lot of computers,” he said. However, Giannandrea noted, “it may be several years before this research makes a difference to Google products.”

Moore’s Law at 50: At Least A Decade More To Go And Why That’s Important

Gordon Moore, now 86, is still spry and still given to the dry sense of humor for which he has always been known. In an Intel interview this year he said that he had Googled “Moore’s Law” and “Murphy’s Law,” and Moore’s beat Murphy’s by two to one, demonstrating how ubiquitous Dr. Moore’s observation has become. This week we commemorate the 50th anniversary of the April 19, 1965 issue of Electronics magazine, in which Dr. Moore first described his vision of doubling the number of transistors on a chip every year or so.
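Moore's original formulation was a doubling roughly every year, later revised to about every two years. A toy calculation shows why fifty years of that cadence is so dramatic (the 64-transistor baseline here is illustrative, not a quoted figure):

```python
# Compound growth under Moore's Law: a fixed doubling cadence.
def transistors(year, base_year=1965, base_count=64, doubling_years=2):
    """Projected transistor count under one doubling every doubling_years."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# 50 years at one doubling every two years is 25 doublings:
growth = transistors(2015) / transistors(1965)  # 2**25, about 33.5 million-fold
```

The baseline chip size cancels out of the ratio; whatever a mid-1960s chip held, fifty years of biennial doubling multiplies it by roughly 33 million.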



It may seem geeky to be interested in the details of 14 nanometer (billionth of a meter) integrated circuit design rules, 7 nanometer FinFET (transistor) widths, or 5 nanometer line wire widths, but the fact of the matter is that these arcane topics are driving the future of technology applications, telecommunications, business, and economic productivity.  As just one example, this week’s top telecommunications business news is the proposed merger of Nokia and Alcatel-Lucent, with the vision to deploy a 5G (fifth-generation) LTE (Long Term Evolution) mobile telephony network. Building out such a high-speed voice and data network is almost entirely dependent on the power of the microprocessors in the system and ultimately on Moore’s Law.  Nokia apparently believes that it can deploy this technology sooner rather than later and essentially leapfrog the competition.  My UBC Management students will recall that in my first university teaching experience in Industry Analysis, I chose to expose them to the semiconductor industry for this exact reason.  Semiconductors are in virtually every electrical device we use on a daily basis.

However, as we cross this milestone we are able to see that we are nearing the limits of the physics of Moore’s Law.  International Business Strategies, a Los Gatos-based consulting firm, estimates that a decade ago it cost only $16 million to design and test a new very-large-scale integrated circuit (VLSI), but that today the design and testing cost has skyrocketed to $132 million.  Keep in mind that the cost of design, fabrication, and testing of bleeding-edge ICs has been reduced dramatically over the decades by automation, itself driven by Moore’s Law. So we are seeing a horizon line.  That said, entirely new technologies are already in the laboratories and may, in a way, extend Moore’s Law, and the dramatic improvements in cost and productivity that come with it, but through entirely new and different means.

What are the historical shrines of Silicon Valley?

The answers to this question make a great tour of Silicon Valley history. I added my own answer: the historic bronze plaque commemorating Bob Noyce’s invention of the integrated circuit. It is outside the front of the old Fairchild Semiconductor building, at the corner of Arastradero Road and Charleston Road, and is almost completely forgotten. It is probably the most important invention of our generation. Like so much of Silicon Valley, it is difficult to visit the most important sites or get any sense of their significance. But this list is very good. The historical significance of some of these places will be instantly obvious, others less so. They are all important, so it’s your homework assignment.

i.e. the places of great historical significance to the technology industry … HP Garage, Googleplex, Shockley Semiconductor office, etc.


The Top 20 on Scaruffi’s list, to my mind, include these (using his words here):

  1. Stanford’s building 50, where the Physics Dept was (1891), next to the Memorial Church in the “quadrangle”
  2. Stanford’s “Engineering Corner”, where Fred Terman used to work (1902)
  3. The site where in 1909 Charles Herrold established the first radio broadcasting station in the world: Fairmont Tower, 50 W. San Fernando St & First St, San Jose
  4. The site of the laboratory and factory of Federal Telegraph Company (1911), where Lee de Forest worked: 913 Emerson St & Channing Ave, Palo Alto
  5. Philo Farnsworth’s laboratory (1927), where television was invented: 202 Green Street & Sansome, San Francisco
  6. Fisher Research Laboratories (1931) was based in this house: 1505 Byron St, Palo Alto
  7. Hewlett-Packard’s garage (1937), where William Hewlett and David Packard started their business: 367 Addison Avenue, Palo Alto
  8. U.C. Berkeley campus and Lawrence Berkeley National Lab
  9. The location of Hewlett-Packard’s first building (1942): 395 Page Mill Road, Palo Alto
  10. Ampex’s original building (1944): 1313 Laurel St., San Carlos
  11. The street where Varian (1948) was started: Washington St, San Carlos
  12. IBM’s Western Lab (1952), where the Random Access Method of Accounting and Control (RAMAC) was built: 99 Notre Dame Street, San Jose
  13. Shockley’s Laboratory (1956): 391 San Antonio Road, Mountain View
  14. NASA Ames (1958): Moffett Blvd./NASA Parkway, Mountain View
  15. Fairchild (1959), the site where Robert Noyce and others co-invented the integrated circuit: 844 E Charleston Rd, Palo Alto
  16. The building that became the corporate headquarters when HP moved to the Stanford Industrial Park (1960) and then HP Labs (1966): 1501 Page Mill Road, Palo Alto
  17. Venture capital’s headquarters in Menlo Park (1969): 3000 Sand Hill Rd, Menlo Park
  18. Kleiner Perkins Caufield Byers, where Genentech’s first office (1975) was located and where countless start-ups were funded: 2750 Sand Hill Road, Menlo Park
  19. Xerox PARC (1970): 3333 Coyote Hill Road, Palo Alto
  20. Four Phase Systems, which started out in a former dentist’s office (1969): 991 Commercial St, Palo Alto

Source: Quora