Yesterday’s Internet Outage In Parts of U.S. and Canada You Didn’t Hear About



How a Tiny Error Shut Off the Internet for Parts of the US and Canada

Lily Hay Newman

[Photo © Joe Raedle]

A year ago, a DDoS attack caused internet outages around the US by targeting the internet-infrastructure company Dyn, which provides Domain Name System services to look up web servers. Monday saw a nationwide series of outages as well, but with a more pedestrian cause: a misconfiguration at Level 3, an internet backbone company—and enterprise ISP—that underpins other big networks. Network analysts say that the misconfiguration was a routing issue that created a ripple effect, causing problems for companies like Comcast, Spectrum, Verizon, Cox, and RCN across the country.

Level 3, whose acquisition by CenturyLink closed recently, said in a statement to WIRED that it resolved the issue in about 90 minutes. “Our network experienced a service disruption affecting some customers with IP-based services,” the company said. “The disruption was caused by a configuration error.” Comcast users started reporting internet outages around the time of the Level 3 outages on Monday, but the company said that it was monitoring “an external network issue” and not a problem with its own infrastructure. RCN confirmed that it had some network problems on Monday because of Level 3. The company said it had restored RCN service by rerouting traffic to a different backbone.

[Map of reported outages © Downdetector.com]

The misconfiguration was a “route leak,” according to Roland Dobbins, a principal engineer at the DDoS and network-security firm Arbor Networks, which monitors global internet operations. ISPs use “Autonomous Systems,” also known as ASes, to keep track of what IP addresses are on which networks, and route packets of data between them. They use the Border Gateway Protocol (BGP) to establish and communicate routes. For example, packets can route between networks A and B, but network A can also route packets to network C through network B, and so on. This is how internet service providers interoperate to let you browse the whole internet, not just the IP addresses on their own networks.

In a “route leak,” an AS, or multiple ASes, issue incorrect information about the IP addresses on their network, which causes inefficient routing and failures for both the originating ISP and other ISPs trying to route traffic through. Think of it like a series of street signs that help keep traffic flowing in the right directions. If some of them are mislabeled or point the wrong way, assorted chaos can ensue.
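
To see why one bad announcement can ripple outward, here is a minimal sketch, assuming nothing about Level 3's actual configuration: routers forward traffic on the longest (most specific) matching prefix, so a leaked, more-specific route silently captures traffic. This is illustrative Python, not real BGP, and the AS names are hypothetical.

```python
# Illustrative sketch only -- not real BGP. AS names are hypothetical.
import ipaddress

# Routing table: prefix -> network that announced it
routes = {
    ipaddress.ip_network("203.0.113.0/24"): "AS-Legitimate",
}

def next_hop(ip, table):
    """Pick the longest (most specific) matching prefix, as routers do."""
    matches = [prefix for prefix in table if ip in prefix]
    return table[max(matches, key=lambda p: p.prefixlen)] if matches else None

destination = ipaddress.ip_address("203.0.113.10")
print(next_hop(destination, routes))   # AS-Legitimate

# A misconfigured peer leaks a more specific prefix it does not own:
routes[ipaddress.ip_network("203.0.113.0/25")] = "AS-Misconfigured"
print(next_hop(destination, routes))   # AS-Misconfigured: traffic is now misrouted
```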

Route leaks can be malicious, sometimes called “route hijacks” or “BGP hijacks,” but Monday’s incident seems to have been caused by a simple mistake that ballooned to have national impact. Large outages caused by accidental route leaks have cropped up before.

“Folks are looking to tweak routing policies, and make mistakes,” Arbor Networks’ Dobbins says. The problem could have come as CenturyLink works to integrate the Level 3 network or could have stemmed from typical traffic engineering and efficiency work.

Internet outages of all sizes caused by route leaks have occurred occasionally, but consistently, for decades. ISPs attempt to minimize them using “route filters” that check the IP routes their peers and customers intend to use to send and receive packets and attempt to catch any problematic plans. But these filters are difficult to maintain on the scale of the modern internet and can have their own mistakes.
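
The idea behind a route filter can be sketched the same way, again with hypothetical peer names and prefixes: accept an announcement from a peer only if it falls inside the address space that peer is registered to originate.

```python
# Hedged sketch of route filtering -- registry data is hypothetical.
import ipaddress

# Prefixes each peer is expected (registered) to announce
registered = {
    "AS-Customer": [ipaddress.ip_network("198.51.100.0/24")],
}

def accept(peer, prefix):
    """Accept only announcements inside the peer's registered space."""
    return any(prefix.subnet_of(net) for net in registered.get(peer, []))

print(accept("AS-Customer", ipaddress.ip_network("198.51.100.0/25")))  # True
print(accept("AS-Customer", ipaddress.ip_network("203.0.113.0/24")))   # False: leak caught
```

Maintaining that registered-prefix data for thousands of peers is exactly the part that is hard at internet scale, which is why leaks still slip through.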

Monday’s outages reinforce how precarious connectivity really is, and how certain aspects of the internet’s architecture—offering flexibility and ease-of-use—can introduce instability into what has become a vital service.

WCW III: World Chip War III



Intel: The Missing Piece In The Epic New Global Microchip Battle

In the beginning, in the early 1970s, there were the original semiconductor companies like Intel, AMD, and Motorola, and not far behind, the Japanese giants NEC, Fujitsu, and Mitsubishi. The first great Chip War was fought over memory chips, primarily as replacements for magnetic core memory and for the emerging minicomputer industry. Japan fought World Chip War One as a nation, using the power and influence of its entire government to compete against the American companies. At the behest of the U.S. government itself, IBM bought a minority share in Intel to defend it against any potential hostile bid from the Japanese.

Not long afterward, the Great Microprocessor War, World Chip War Two, exploded, primarily between Intel and Motorola. Intel was the victor, owing primarily to the extraordinary marketing genius of Intel Marketing VP Bill Davidow’s “Crush” campaign, not to superior Intel technology. It was a huge lesson in the importance of marketing over having the “coolest technology.”

Now, after something of a long hiatus, we have World Chip War Three, which is being fought over “CODECS” and the related chips that power our smartphones. Today’s news about Broadcom’s bid for Qualcomm omits the other crucial player in this new War of Titans: Intel, which has risen from earlier ignominious failures to become the third player in WCW III.

Broadcom’s Bid For Qualcomm Marks Upheaval in Chip Industry

The chip maker made an unsolicited $105 billion takeover bid for Qualcomm

Broadcom proposed to acquire rival chip maker Qualcomm for $70 per share. PHOTO: MIKE BLAKE/REUTERS

Broadcom Ltd. made an unsolicited $105 billion takeover bid for Qualcomm Inc., the chip industry’s boldest bet yet that size will equal strength at a time of technological upheaval.

The approach, which would mark the biggest technology takeover ever, shows how tech companies are positioning themselves for a world where a range of chip-driven devices—from phones to cars to factory robots—are transmitting, receiving and processing ever more information. Broadcom Chief Executive Hock Tan already has used acquisitions to build the company into the fourth-biggest chip maker by market value, part of a wave of industry consolidation as profits on some chips, such as those used in personal computers, are squeezed by sluggish sales and rising costs.

A combination with Qualcomm would create a behemoth whose chips manage communications among consumer devices and appliances, phone service providers, and data centers that are becoming the workhorses in artificial intelligence.

The deal is far from certain. San Diego-based Qualcomm, which said it would consider the proposal, is expected ultimately to rebuff it on the grounds that the price isn’t high enough, especially given the significant risk that regulators would block it, according to some analysts. Under typical circumstances, unfriendly bids like this are difficult to pull off; given the sheer size and complexity of Qualcomm, this one could be especially challenging, analysts said Monday.

Broadcom’s preference is to strike a friendly deal, but if it fails to do so, it would consider nominating Qualcomm directors who may be more amenable to a transaction, a person familiar with the matter said. The nomination deadline is Dec. 8, and the annual meeting at which the director vote would take place is likely to be around March.

Broadcom offered $70 a share for Qualcomm, representing a 28% premium over its closing price on Thursday—before news reports on the expected approach.

Qualcomm shares ended trading Monday up 1.2% to $62.52, while Broadcom shares were 1.4% higher at $277.52.

Mr. Tan said he has been talking with Qualcomm for over a year about a possible tie-up. “Our strategy has been consistent,” Mr. Tan said in an interview. “When a business is No. 1 in technology and No. 1 in market position, we acquire it and put it on our Broadcom platform and grow through that strategy. Qualcomm has a very large sustainable franchise that meets those criteria.”

Should the deal be completed, Broadcom would take on Qualcomm’s leadership in developing the next wave of cellular technology, known as 5G, which is expected to roll out over the coming two years. That could give Broadcom a new growth engine, as 5G is expected to dramatically accelerate the speed and responsiveness of cellular communications necessary for applications like self-driving cars.

Broadcom was formed when Avago Technologies Ltd. bought the former Broadcom in 2015 for $39 billion and kept the name, and Mr. Tan has continued growing by acquisition. The company sells a diverse line of equipment for networking and communications. Its products include chips for Wi-Fi and Bluetooth technology that connect devices that are closer together—technologies that some analysts say are likely to grow less quickly than 5G.

“People will continue to use short-proximity wireless like Wi-Fi and Bluetooth, but the growth and money is clearly in 5G,” said analyst Patrick Moorhead of Moor Insights & Strategy.

Overall, Broadcom and Qualcomm have largely complementary product lines. But the possible Broadcom takeover is likely to face intense regulatory scrutiny, given the companies’ combined scale and the fact that they are both leaders in Wi-Fi and Bluetooth technology. The companies share customers including Apple Inc., whose iPhones and iPads include components from both Qualcomm and Broadcom.

Qualcomm already has been under pressure from antitrust agencies in several jurisdictions, including the U.S. The company has paid hefty regulatory fines in China, South Korea and Taiwan.

Qualcomm was riding high as recently as a year ago after unveiling the chip industry’s largest-ever acquisition: a $39 billion proposed deal for NXP Semiconductors NV. The deal hasn’t closed yet, and Broadcom said Monday that its proposal would stand regardless of whether Qualcomm’s proposed acquisition of NXP is consummated under the current terms.

Since then, a string of hits by regulators, competitors, and customers including Apple has left the industry titan in a vulnerable position. Qualcomm’s profit in the fiscal year that ended Sept. 24 plummeted 57%, and its share price declined 18% in the 12 months through Thursday’s close compared with a 58% rise in the PHLX Semiconductor Sector Index. That was before news of Broadcom’s interest sent Qualcomm shares up nearly 13% on Friday.

Funding for the deal would come in the form of loans from a gaggle of banks, with additional cash from Silver Lake Management LLC. The private-equity firm, which already owns a stake in Broadcom, provided a commitment letter for $5 billion in convertible debt. Silver Lake said a substantial portion of that capital would come in the form of an equity investment from its Silver Lake Partners fund, with the remainder from other sources.

The equity contribution would be the single largest in the history of the firm, exceeding the roughly $1 billion it invested in the merger of Dell Inc. and EMC Corp.

Broadcom’s bid came days after the Singapore-based company announced plans to relocate its headquarters to the U.S., a move that could make it easier to pursue acquisitions of U.S. targets.

Broadcom’s earlier $5.5 billion offer to buy Brocade Communications Systems, based in San Jose, Calif., has been delayed due to a review by the Committee on Foreign Investment in the United States, which reviews international deals that raise concerns about national security.

Any deal to acquire Qualcomm would also receive close scrutiny, experts say. “Anything that has the word semiconductor in it gets rapt attention from CFIUS,” said James Lewis of the Center for Strategic and International Studies, a policy think tank. “The move to the U.S. is an effort to tamp down CFIUS concerns.”

Silicon Valley Is Suffering From A Lack of Humanity



Deep Down We All Know Silicon Valley Needs The Humanitarian Vision of Steve Jobs

The genius of Steve Jobs lies in his hippie period and his time at Reed College. With the deep ethical problems facing technology now, we need Jobs’ vision more than ever.

To his understanding of technology, Jobs brought an immersion in popular culture. In his 20s, he dated Joan Baez; Ella Fitzgerald sang at his 30th birthday party. His worldview was shaped by the ’60s counterculture in the San Francisco Bay Area, where he had grown up, the adopted son of a Silicon Valley machinist. When he graduated from high school in Cupertino in 1972, he said, “the very strong scent of the 1960s was still there.” After dropping out of Reed College, a stronghold of liberal thought in Portland, Ore., in 1972, Mr. Jobs led a countercultural lifestyle himself. He told a reporter that taking LSD was one of the two or three most important things he had done in his life. He said there were things about him that people who had not tried psychedelics — even people who knew him well, including his wife — could never understand.

Decades later Jobs flew around the world in his own corporate jet, but he maintained emotional ties to the period in which he grew up. He often felt like an outsider in the corporate world, he said. When discussing Silicon Valley’s lasting contributions to humanity, he mentioned in the same breath the invention of the microchip and “The Whole Earth Catalog,” a 1960s counterculture publication. Jobs’ experience resonates with my own experience in the Santa Clara Valley at that time. Jobs and I were both deeply affected by Stewart Brand, the visionary behind The Whole Earth Catalog. Stanford professor Fred Turner has documented this period in his book “From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism.”

For me this journey also began with the extraordinary vision of Marshall McLuhan, the Canadian professor of communications, who predicted the emergence of the World Wide Web and “The Global Village” like some kind of modern-day Nostradamus.

Stewart Brand is also featured in Tom Wolfe‘s book, “The Electric Kool-Aid Acid Test,” along with Ken Kesey’s Merry Pranksters and The Grateful Dead. I had the great good fortune to meet Brand at a COMDEX Microsoft event in a hangar at McCarran Airport in Las Vegas and was immediately impressed by him, as was Jobs. Not surprisingly, Brand was an invited guest at the Microsoft event, having already seized on the importance of the personal computer and the prospect of a networked World. Recently, in another anecdote on that time, Tim Bajarin shared a wonderful story about Jobs’ counterculture friend and organic gardener, who remains the manager of the landscape at the new Apple campus, retaining the feeling of the original Santa Clara Valley orchard economy that some of us can still remember.

It is important to think back to that time in the Bay Area and the euphoria of the vision of “digital utopianism.”   It grounds me and helps me to understand where we have gone so terribly wrong.

Digital utopianism is now dead. I have written about its sad demise on this blog. The wonderful vision of digital utopianism and the Web has been perverted by numerous authoritarian governments, now including our own, resulting in a Balkanized Web and a dark Web pandering to all kinds of evil. This is the problem we face and the urgent need for greater emphasis on ethics. What about human life, culture, and values? So many areas of technology are on the verge of deep philosophical questions. Uber has become the poster child for everything that is wrong with Silicon Valley. I ask myself, “What would Steve Jobs have said about Travis Kalanick and Uber?” I think we know the answer. Ironically, Silicon Valley has a center for research and study in ethics, the Markkula Center for Applied Ethics at Santa Clara University. Mike Markkula was an Intel marketing guy who left Intel to join two crazy long-haired guys in Cupertino.

I am a Liberal Arts & Humanities graduate myself, including graduate study at Oxford University. When I returned from England I asked the obvious question: now how do I make a living? As it happened, I very improbably landed my first real job at Intel Corporation. When I asked why I was hired, the answer was that I was judged to have the requisite talent and aptitude, if not the technical knowledge. I later developed a reputation for being very “technical” by a process of “osmosis”: simply living in a highly rarefied technical culture and receiving whiteboard tutorials from friendly engineers. I was thrown into a group of Ivy League MBAs. We wistfully shared a desire to have the others’ educations, but simply working together made us all more effective. Amazingly, my career grew almost exponentially, and I attribute my success to that cross-fertilization.

While with Intel in Hillsboro, Oregon, I was approached to represent Intel at a talk with Reed students. I was cautioned that few if any Reed students would be interested in working for Intel, but that they would be very intellectually engaging. That proved to be a significant understatement. In the end, I believe that perhaps two dozen “Reedies,” as they are known, joined Intel, one of whom went on to a stellar career as a Silicon Valley venture capitalist. A significant part of my later career has been devoted to using my Humanities education to assess and translate deep technology in human terms for the benefit of both management and potential customers.

Today, none of my story would likely happen, but the influence of the Humanities and Arts in business seems more sorely needed than ever.

Read more: Why We Need Liberal Arts in Technology’s Age of Distraction – Time Magazine – Tim Bajarin

Read more: Digital Utopianism of Marshall McLuhan and Stewart Brand is Cracking – mayo615.com

Read more: Liberal Arts In The Data Age – Harvard Business Review

The Importance of “Convergence” In Market and Industry Analysis


[Book cover: The New Business Road Test]

If You Get Technology “Convergence” Wrong, Nothing Else Matters

I came across this book during my most recent visit to the UBC Vancouver campus. As good as I think this book is at focusing attention, in workbook style, on the importance of market and industry analysis in new venture due diligence, there is an issue that I think is not adequately addressed by any model or theory: not Porter, not STEEP, not SWOT. Convergence is the issue.

We can imagine and even potentially envision a very cool business idea, but if the technology to achieve it is not ready, not sufficiently mature, the idea is Dead on Arrival (DOA). I do not mean to pick on young entrepreneurs, but I reviewed a business concept last week that was a superb and compelling idea, but the technology necessary to achieve it simply was not there, either in terms of its capability or its price point. I am confident that it will be there in time, but it is not now. As if to make my point, Apple announced that it was acquiring a company for $20 Million in the exact same technology area: indoor location tracking (no small feat). At this point it is not clear that the acquired company has any extraordinary intellectual property or expertise, and the article primarily focused on the point that this “location identification” technology was “heating up.” It looks like it may be a simple “acqui-hire.” Global positioning and geotagging as in smart mobile phones, radio frequency identification (RFID), and inertial guidance are all currently used in various combinations by a host of competitors (too many) to achieve the required levels of accuracy, immediacy, and cost. A local industrial RFID company has just closed its doors because it simply could not compete and make money. The simple problem was that this company’s idea, as compelling as it was, could not achieve the necessary price point, or possibly would not even work.

So we have the problem of “convergence”: a great idea, but the technology simply is not ready… yet.

I have three personal case studies of the problem of “convergence” that every potential entrepreneur should examine. I have to admit that I was a senior executive at all three of these Silicon Valley companies, one of which actually made it to the NASDAQ exchange. All of them had the “convergence” problem: too early for the available technology.

1. Silicon Graphics. Silicon Graphics was founded in the early 1980s by a pre-eminent Stanford professor, Jim Clark, on the idea that 3D visualization of complex problems would become the next big wave in technology. As a minor side business, it also excelled at computer animation, a growing new market of interest to Steve Jobs and others. It is now obvious that Clark was onto what has finally become the Next Big Thing, but at that time the available technology simply made it too difficult and too expensive. Silicon Graphics no longer exists. Silicon Graphics’ crown jewel was its enabling software, the SGI Graphics Library, which still exists in open form as OpenGL.

Read more: http://mayo615.com/2013/03/31/hans-rosling-makes-visual-sense-of-big-data-analytics/

2. iBEAM Broadcasting. iBEAM was the precursor of YouTube, but too far ahead of its time. The founder, Mike Bowles, a former MIT professor, envisioned streaming media across the Internet, but this was in 1999. Intel, Fox Entertainment, Reuters, Bloomberg, and Microsoft were all involved, some investing significant sums in the company. We tried mightily to make it happen for Mike, but there were technology convergence problems. The Internet at that time simply did not have sufficient reliable broadband capability; in 1999 the vast majority of Internet users were still on dial-up connections. The company, with help from Microsoft and its other deep-pocketed investors, turned to satellite transmission, which is immensely expensive. I did learn a lot about the satellite business. Great idea, way too early, and the company failed early.

3. P-Cube. In 2001, I was approached by prominent friends at two downtown Palo Alto venture capital firms to consider joining an Israeli startup in which they had invested. The idea was wildly popular at the time: traffic policy management and so-called Internet traffic shaping. I enthusiastically joined the new company and became its first U.S.-based employee. The compelling idea was simple: make money by charging for bandwidth. The background idea was to enable deep IP packet payload snooping to prioritize traffic, but also for its political potential. This is the technology that Dick Cheney employed after 9/11 to snoop all Internet traffic. The only problem was that the technology was simply not yet ready. The P-Cube Internet traffic switch was a 24-layer printed circuit board (hideously difficult to fabricate), with five IBM PowerPC chips, 1 GB of onboard memory (bleeding edge at the time, though today’s laptops have more), a host of application-specific integrated circuits (ASICs), and, to top it off, a proprietary software language to program the box. In the end, P-Cube burned through $100 Million in venture capital, and I had great fun traveling the World selling it, but the box never worked, largely because the technology simply was not there. P-Cube’s assets were bought by Cisco Systems, and today such capability is built into the boxes of Cisco Systems, Juniper Networks, and others.

The key takeaway lesson from this: do not underestimate the importance of technology convergence with a great idea.

New Accelerate Okanagan Report On Tech Industry: Devil Is Again In the Details



Problems Persist With New 2016 Accelerate Okanagan “Tech Industry Analysis”


Accelerate Okanagan should be commended for publishing a document, the stated goal of which is to “assist in attracting new talent, companies, and potential investors to the Okanagan, as well to inform policy makers and the media.” Such reports are commonly used to promote a community or region’s economy. However, as with the earlier 2015 report, there are persistent issues, particularly with the industry definition and methodology of the study. The result is questionable data and numbers that simply do not pass a basic “sniff test.” Accepting the results of this study as published may only serve to mislead community leaders on planning, and mislead prospective entrepreneurs considering relocating here.

I taught Industry Analysis at the University of British Columbia, and my entire career has been in high-tech in Silicon Valley and globally, beginning with many years at Intel Corporation, so my assessment is exclusively from a professional perspective. A PowerPoint presentation of my work in this area is posted on this website, under the heading Professional Stuff.

The report begins by explaining that the study was completed by an unnamed third party, apparently affiliated with Small Business BC.  A review of the Small Business BC website, staff, and services indicates the organization is almost exclusively organized and resourced to provide services only to individual small businesses. For example, scanning SBBC’s “Market Research” heading, it indicates that its services are focused entirely on smaller scale research for an individual small business, not a large scale analysis of an entire industry in a region.  Industry analyses of such scale are better suited to a local educational institution like UBC, with all the requisite skills and resources.  Though I have no inside knowledge, it seems reasonable to surmise that some degree of budgetary constraint and political influence were involved in the selection of SBBC, and a desire to emphasize local promotion over objective accuracy.

With regard to methodology and industry definition, the Report states that it follows the methodology of British Columbia’s High Tech Sector Report, the most recent of which is from 2014. A closer look at this methodology can be found on the provincial government website. A separate document is listed, “Defining the British Columbia High Technology Sector Using NAICS,” published fifteen years ago in 2001. My review of this document indicates that while it offers some useful discussion, it is seriously out of date and in need of revision.  A more professional approach would have required the development of a more current methodology relevant to the Okanagan situation. The BC methodology document does provide some very cogent cautionary remarks on high-tech industry definition and methodology:

The “high technology” sector is a popular subject of discussion and analyses, partly because it is viewed as an engine of growth both in the past and for the future. However, the high-technology sector has no specific and universally accepted definition. Defining and measuring the high technology sector can be done as part of basic research at the level of individual firms. A second, more “modest” approach uses pre-existing data collected on “industries” which are defined for general statistical purposes. The challenge is to determine which of these industries warrants inclusion in the measurement of the high technology sector.

The AO Report author seems to have accepted both approaches. Page 4 of the Report explains that the author decided to also include “the previous survey undertaken by Accelerate Okanagan.”  The previous AO survey was simply a Survey Monkey survey submitted by individual local businesses. The results were apparently compiled without additional professional judgment applied, or follow-up contact with companies by phone or other means and cross-referencing with the more “modest” macro data methodology mentioned in the 2001 BC document. IMHO, if my assumptions are correct, the Survey Monkey data should have been thrown out as unreliable, or regenerated with much greater scrutiny and judgment applied.

Then there is the issue of Kelowna as an employment market, as noted in the recently reported Bank of Montreal (BMO) and BC Business low national and provincial rankings of Kelowna’s employment market. These issues have also been reported in KelownaNow. Hootsuite’s founder, who is from Vernon, consciously chose Vancouver to start his company. CEO Ryan Holmes openly admitted that he did not base Hootsuite in the Okanagan because he knew he would not be able to attract the necessary talent here. It is also important to note that a significant number of local business and community leaders met with the BC Labour Minister and reported that their primary concern was a lack of Temporary Foreign Workers, not economic development or the growth of the local high-tech industry.

The AO Report touches on these issues only very tangentially and indirectly in the closing pages. A more credible approach would have been to confront these local problems directly, citing the BMO report for example, and what AO and the community plan to do about it.  Clearly, there are unresolved and ignored contradictions with the AO report that damage its credibility and usefulness.

Finally, this week’s media coverage of the report has died down, having duly reported all the desired sound bites, but a Google search shows that the coverage has so far come almost exclusively from local Okanagan media, which does not meet the stated goal of the AO effort to broadcast the promotion beyond the Okanagan.

Read the complete AO September 2016 report here:

Click to access Economic_Impact_Study_2015_Edition.pdf

MAYO615 REPOST from January, 2015:

AO Tech Industry Report Lacks The Rigor Necessary To Give It Much Credibility

Read the AO January 2015 press release and access the full report here

The AO report’s “economic impact” conclusions are based on 2014 Survey Monkey voluntary responses, which are problematic due to an apparent lack of critical assessment. The report does not follow the kind of rigorous industry analysis performed by leading technology consultancy firms like International Data Corporation (IDC) or Gartner. The definition of an “industry,” for example the “automobile industry in Canada,” involves broad activity around all aspects of “automobiles,” but at some point, firms like Kal Tire or “Joe’s Brake Shop” might be excluded from a definition of the automobile industry. The report does not mention the rigor applied to this industry analysis, so the question is left open: what exactly is the “tech industry” in the Okanagan? A well-defined $1 Billion industry is the mobile advertising industry in Canada. Is that what we have in the Okanagan? By way of comparison, I reported on New Zealand’s Ice House tech incubator economic impact report, which has much greater credibility. The AO report is essentially claiming that the Okanagan technology economy is more than twice the size of New Zealand’s. That’s too big of a leap of faith for me. Read New Zealand’s Ice House Startups Achieve Impressive Results and contrast it with the AO report.

Then there is the issue of Kelowna as an employment market, as noted in the recently reported Bank of Montreal (BMO) and BC Business low national and provincial rankings of Kelowna’s employment market. These issues have also been reported in KelownaNow. Clearly, there are unresolved contradictions with the AO reports.

Read More: Kelowna’s Low Jobs Ranking

Read More: Okanagan economy likely to worsen next year

I offer a summary view of “industry analysis” here: Industry Analysis: the bigger picture

Google’s Quantum Dream May Be Just Around The Corner



In 1981, Richard Feynman, probably the most famous physicist of his time, asked the question: “Can we simulate physics on a computer?” At the time the answer was “theoretically yes,” but practically not yet. Today, we may be on the verge of answering “yes” in practice to Feynman’s original question. Quantum computers operate in such a strange way, and are so radically different from today’s computers, that understanding them requires some grasp of quantum mechanics and bizarre properties like “quantum entanglement.” Quantum computers are in a realm orders of magnitude beyond today’s supercomputers. Their application to specific computational problems like cryptography, Big Data analysis, computational fluid dynamics (CFD), and sub-atomic physics will change our World. The Canadian quantum computing company D-Wave Systems has been at the center of Google’s efforts to pioneer this technology.

Reblogged from New Scientist

Google’s Quantum Dream May Be Just Around the Corner


31 August 2016

Revealed: Google’s plan for quantum computer supremacy

The field of quantum computing is undergoing a rapid shake-up, and engineers at Google have quietly set out a plan to dominate

SOMEWHERE in California, Google is building a device that will usher in a new era for computing. It’s a quantum computer, the largest ever made, designed to prove once and for all that machines exploiting exotic physics can outperform the world’s top supercomputers.

And New Scientist has learned it could be ready sooner than anyone expected – perhaps even by the end of next year.

The quantum computing revolution has been a long time coming. In the 1980s, theorists realised that a computer based on quantum mechanics had the potential to vastly outperform ordinary, or classical, computers at certain tasks. But building one was another matter. Only recently has a quantum computer that can beat a classical one gone from a lab curiosity to something that could actually happen. Google wants to create the first.

The firm’s plans are secretive, and Google declined to comment for this article. But researchers contacted by New Scientist all believe it is on the cusp of a breakthrough, following presentations at conferences and private meetings.

“They are definitely the world leaders now, there is no doubt about it,” says Simon Devitt at the RIKEN Center for Emergent Matter Science in Japan. “It’s Google’s to lose. If Google’s not the group that does it, then something has gone wrong.”

We have had a glimpse of Google’s intentions. Last month, its engineers quietly published a paper detailing their plans (arxiv.org/abs/1608.00263). Their goal, audaciously named quantum supremacy, is to build the first quantum computer capable of performing a task no classical computer can.

“It’s a blueprint for what they’re planning to do in the next couple of years,” says Scott Aaronson at the University of Texas at Austin, who has discussed the plans with the team.

So how will they do it? Quantum computers process data as quantum bits, or qubits. Unlike classical bits, these can store a mixture of both 0 and 1 at the same time, thanks to the principle of quantum superposition. It’s this potential that gives quantum computers the edge at certain problems, like factoring large numbers. But ordinary computers are also pretty good at such tasks. Showing quantum computers are better would require thousands of qubits, which is far beyond our current technical ability.
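
As a toy illustration of that superposition principle (our own sketch, assuming NumPy is available; this is a mathematical model, not how Google’s hardware works), a qubit can be represented as a two-element complex vector, and a Hadamard gate puts it into an equal mixture of 0 and 1:

```python
import numpy as np

# A qubit state is a complex vector over the basis states |0> and |1>.
zero = np.array([1, 0], dtype=complex)

# The Hadamard gate creates an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ zero

print(np.abs(psi) ** 2)   # [0.5 0.5]: equal odds of measuring 0 or 1
```

An n-qubit register needs 2**n such amplitudes, which is where the exponential cost of simulating quantum circuits classically, discussed below, comes from.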

Instead, Google wants to claim the prize with just 50 qubits. That’s still an ambitious goal – publicly, they have only announced a 9-qubit computer – but one within reach.


To help it succeed, Google has brought the fight to quantum’s home turf. It is focusing on a problem that is fiendishly difficult for ordinary computers but that a quantum computer will do naturally: simulating the behaviour of a random arrangement of quantum circuits.

Any small variation in the input into those quantum circuits can produce a massively different output, so it’s difficult for the classical computer to cheat with approximations to simplify the problem. “They’re doing a quantum version of chaos,” says Devitt. “The output is essentially random, so you have to compute everything.”

To push classical computing to the limit, Google turned to Edison, one of the most advanced supercomputers in the world, housed at the US National Energy Research Scientific Computing Center. Google had it simulate the behaviour of quantum circuits on increasingly larger grids of qubits, up to a 6 × 7 grid of 42 qubits.

This computation is difficult because as the grid size increases, the amount of memory needed to store everything balloons rapidly. A 6 × 4 grid needed just 268 megabytes, less than found in your average smartphone. The 6 × 7 grid demanded 70 terabytes, roughly 10,000 times that of a high-end PC.

Google stopped there because going to the next size up is currently impossible: a 48-qubit grid would require 2.252 petabytes of memory, almost double that of the top supercomputer in the world. If Google can solve the problem with a 50-qubit quantum computer, it will have beaten every other computer in existence.
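
Those memory figures follow from simple arithmetic that anyone can check: a full simulation stores 2^n complex amplitudes for n qubits. At 16 bytes per amplitude (double precision) this reproduces the 268-megabyte and 70-terabyte figures, while the 2.252-petabyte figure for 48 qubits matches 8 bytes per amplitude (single precision). A back-of-envelope sketch, ours rather than Google’s:

```python
# Back-of-envelope check of the memory figures quoted above.
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory to store 2**n complex amplitudes for an n-qubit state."""
    return 2 ** n_qubits * bytes_per_amplitude

print(statevector_bytes(24) / 1e6)      # ~268 MB  (6 x 4 grid, 24 qubits)
print(statevector_bytes(42) / 1e12)     # ~70 TB   (6 x 7 grid, 42 qubits)
print(statevector_bytes(48, 8) / 1e15)  # ~2.25 PB (48 qubits, single precision)
```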

Eyes on the prize

By setting out this clear test, Google hopes to avoid the problems that have plagued previous claims of quantum computers outperforming ordinary ones – including some made by Google.

Last year, the firm announced it had solved certain problems 100 million times faster than a classical computer by using a D-Wave quantum computer, a commercially available device with a controversial history. Experts immediately dismissed the results, saying they weren’t a fair comparison.

Google purchased its D-Wave computer in 2013 to figure out whether it could be used to improve search results and artificial intelligence. The following year, the firm hired John Martinis at the University of California, Santa Barbara, to design its own superconducting qubits. “His qubits are way higher quality,” says Aaronson.

It’s Martinis and colleagues who are now attempting to achieve quantum supremacy with 50 qubits, and many believe they will get there soon. “I think this is achievable within two or three years,” says Matthias Troyer at the Swiss Federal Institute of Technology in Zurich. “They’ve showed concrete steps on how they will do it.”

Martinis and colleagues have discussed a number of timelines for reaching this milestone, says Devitt. The earliest is by the end of this year, but that is unlikely. “I’m going to be optimistic and say maybe at the end of next year,” he says. “If they get it done even within the next five years, that will be a tremendous leap forward.”

The first successful quantum supremacy experiment won’t give us computers capable of solving any problem imaginable – based on current theory, those will need to be much larger machines. But having a working, small computer could drive innovation, or augment existing computers, making it the start of a new era.

Aaronson compares it to the first self-sustaining nuclear reaction, achieved by the Manhattan project in Chicago in 1942. “It might be a thing that causes people to say, if we want a fully scalable quantum computer, let’s talk numbers: how many billions of dollars?” he says.

Solving the challenges of building a 50-qubit device will prepare Google to construct something bigger. “It’s absolutely progress to building a fully scalable machine,” says Ian Walmsley at the University of Oxford.

For quantum computers to be truly useful in the long run, we will also need robust quantum error correction, a technique to mitigate the fragility of quantum states. Martinis and others are already working on this, but it will take longer than achieving quantum supremacy.

Still, achieving supremacy won’t be dismissed.

“Once a system hits quantum supremacy and is showing clear scale-up behaviour, it will be a flare in the sky to the private sector,” says Devitt. “It’s ready to move out of the labs.”

“The field is moving much faster than expected,” says Troyer. “It’s time to move quantum computing from science to engineering and really build devices.”

Canada Glaringly Absent From World’s 10 Most Innovative Countries



The following infographic provides an excellent overview of the World’s Most Innovative Countries and the weighted criteria used to rank the top 10. Glaringly, Canada is completely absent from this list. It is worth noting that eight of the ten countries listed have much smaller populations than Canada. Published by The Times of London, the list is not perfect. I find it a bit busy, and it does not include consideration of the OECD data on investment in research & development in the leading industrialized countries. That said, I have little essential disagreement with this list. I also believe that Dan Muzyka and the Conference Board of Canada would not disagree with this assessment. Investment in research & development, leading to commercial technology innovation, is crucial to a country’s economic growth and competitiveness in productivity. Canada lags in every category.

READ MORE: World’s Most Innovative Countries: The Times of London


[Infographic: the World’s 10 Most Innovative Countries]

Partnerships, Collaboration and Co-opetition: More Important Than Ever



A Strategy For Survival in Tough Times

In the simplest terms, the concept here is how a company can potentially increase both revenue and market share by executing a strategy to work with its direct or indirect competitor(s) to the benefit of both, a win-win.  The old Arab saying, “My enemy’s enemy is my friend” also applies. It can also be as simple as joining an ad hoc collaboration among a group of companies or a standards group to create market order and simplicity from an overcrowded and confused market.  Customers invariably respond to products that provide the greatest value and paths to long-term increased value and cost reduction. Collaboration or “Co-opetition” is one of the most effective means to achieve that goal, particularly in an economic environment where “flat is the new up.”

Multibus: An Early Example of Collaboration Building A New Market

Soon after joining Intel, I learned about Intel’s concept of “Open Systems” and its “Multibus” system architecture.  Motorola was Intel’s primary competitor in microprocessors and so-called “single board computers” at that time.  Intel’s now legendary Marketing VP, Bill Davidow had developed a strategy to recruit other companies to support Multibus as an open system standard.  Davidow’s idea was to make Multibus more attractive to system designers by having a stable of compatible products from other companies supporting Multibus. It worked. Since that time the concept has evolved significantly and has played a major role in the development of many new markets. This post discusses some of the evolutionary changes, offers two high-tech case studies and some key requirements for successful collaboration.  It is more important now than ever as a survival strategy in a particularly challenging global economy.

The IBM Personal Computer Sets The Standard For The Future

Perhaps the best known high-tech example of an open system is the IBM Personal Computer, involving IBM, Intel, Microsoft, and thousands of other supporting companies. The result has been the creation of a huge new market, with over 400,000 applications for the PC, significant price competition, and interchangeable components from multiple vendors.  By contrast, Apple opted for a closed, proprietary system, which persists to this day, and continues to be a source of discontent from Apple customers: higher prices, as well as accessories and interfaces only available from Apple, etc. In sheer market share, the PC dominated at 85% of the total market, while Apple was forced to concentrate on niche markets like education and graphic design. I am not going to discuss the PC as it has been analyzed extensively over the years, though it does provide an excellent case study on the dynamics and market power of open systems versus closed proprietary systems.

Important Current Co-opetition Successes: DSL And Android

I will discuss two other cases, one less well known and the other better known and more recent.  In the first case, I was personally involved so my experience enables me to speak in-depth on the topic.  Shortly after leaving Ascend Communications, I was called by a friend at Compaq/HP in Houston and asked to fly down to Houston for a private discussion with the VP of the Presario Division and his team.  The VP wanted to incorporate a high-speed digital subscriber line (DSL) connection in the Presario out of the box.  The idea was that a consumer would connect the PC to a standard RJ11 telephone wall jack, and be instantly connected to the Internet.  However, I had to explain that the challenges to this were enormous. First and foremost the telephone companies themselves could not agree on the standard for how DSL worked. Equally problematic, the DSL market was fragmented with dozens of competitors offering different proprietary solutions.

We decided to proceed regardless, recognizing that if HP/Compaq were to succeed with their ingenious idea, it would require a fundamental change in the current DSL market and the telcos. This could only be attempted if Compaq joined forces with Intel and Microsoft, and even then the outcome would be uncertain. I contacted Ali Sarabi in Intel’s Architecture Labs, who admitted that Intel had been thinking of the same idea, and talking with Microsoft as well. So within two weeks all three companies met at Microsoft in Bellevue, and the idea gained steam. Soon after, we held three days of secret meetings in Atlanta with DSL companies, without explaining our purpose, and came away completely dejected. Bringing the competitors together was hopeless; they all pointed in different directions. It then dawned on us that if we could get the telecom companies to agree on a single DSL standard, they could unite as “the customers” and dictate to the DSL competitors what they would buy. Nothing works better than the opportunity to make money.

Another round of secret meetings in Seattle with the telecoms, and follow-up meetings around the country, led to a breakthrough: the formation of a global consortium of over 100 telecom and DSL companies that culminated in the International Telecommunication Union in Geneva, Switzerland creating a single global DSL standard, which eventually made the original Compaq Presario vision a reality.

Special Interest Group Legal Framework Paves The Way

One of the keys to this success was a simple legal framework for the companies to collaborate, known now as a “Special Interest Group,” avoiding any hint of unfair competition and ensuring that the technical aspects of the standard would be in the public domain. The SIG legal document has since been used in a number of other developments, notably Bluetooth and USB.  Other standards bodies, like the IEEE and IETF, are also structured similarly, enabling the creation of crucial collaborative projects like WiFi. These efforts are now a key aspect of many high-tech markets. Many companies devote entire teams to managing their participation in these standards bodies and ad hoc industry collaboration activities. Even on a small scale, some agreed framework, a Memorandum of Understanding or a simple one-pager may be required to achieve the necessary trust to move forward.

Android Repeats The IBM PC Phenomenon

The second case of successful global industry-wide collaboration is the Google Android smartphone operating system versus Apple iOS. Once again, Android is an open architecture while Apple iOS is a closed proprietary system. Android has been adopted by a wide range of smartphone manufacturers, most notably Samsung, HTC, and Huawei. Despite the well-publicized popularity of Apple’s iPhone, the fact remains that Android, as an open architecture, dominates the global smartphone market with an 82% share in 2015, as reported by International Data Corporation (IDC), with Apple again stuck in the 15% range.


Global Smartphone Market Share 2015 (IDC)

Two Failures To Collaborate: Videoconferencing And The Internet of Things

The video conferencing market has been around for nearly thirty years. Originally, there were big, bulky proprietary systems. Cisco Systems later became a major player with its own impressive HD technology. In all, there were nearly a dozen major competitors addressing an “enterprise market” for business use only, and the equipment was very expensive. Then along came Skype, WebEx, Apple FaceTime, and others. The problem is that, after thirty years, none of these competitors’ applications can talk to any other application. Clearly, this is a problem. So “middleware” startups have sprung up, offering a simple translation of otherwise incompatible video transmission protocols. Bluejeans is one excellent example. I have used it personally in my UBC classes to link a guest lecture on Skype to UBC’s corporate video conferencing system, because there is no other way to do it. Is this the best solution, or cost-effective? Absolutely not. Why, after thirty years, has the video conferencing industry failed to standardize?

In another case, the emerging new market buzzword is “The Internet of Things.” This means that everything in your home can and will be connected to the Internet. Sounds simple enough, right? Not exactly. Today the IoT market remains a complex, confusing Tower of Babel, with multiple competing communications protocols. Some products support WiFi, but there is no single agreed way to communicate. A recent ZDNet post explains that home automation currently requires devices that can connect with “multiple local- and wide-area connectivity options (ZigBee, Wi-Fi, Bluetooth, GSM/GPRS, RFID/NFC, GPS, Ethernet). Along with the ability to connect many different kinds of sensors, this allows devices to be configured for a range of vertical markets.” Huh? This is the problem in a nutshell. You do not need to be a data communication engineer to get the point. I have written on this blog about this embarrassing failure to collaborate.

Summary

While the open architecture of the PC happened more or less organically, as so many companies were keen to get in on the action, the DSL problem was a hairball of enormous global complexity that had to be solved.  I am honored to have been part of that effort. Google’s decision to launch Android as an open architecture was more like Multibus, and the conscious strategic decision of Eric Schmidt and Larry Page to enter the market as an open system from the outset. Other examples in other industries abound and are documented in the now legendary book, Co-opetition.


The result in all three successful cases has been a dramatic market success. The key takeaway point is that in all three cases the open architecture created opportunity and expanded the market.  Industry collaborations like this are as relevant for smaller markets with only two or three competitors as for large complex markets.  Collaboration can be the key to company survival or failure.

Canadian Unicorn Hootsuite Valuation Written Down By Fidelity Investments


Talk on the street suggests that Hootsuite’s problems are not all related to the downturn in the larger venture capital and private investment markets. There has been criticism of Hootsuite’s newest Dashboard iteration and of the Hootsuite software design and development process in general, and rumors of stagnant revenue growth as competition has entered the market. In addition, there has been criticism of Holmes’ personal leadership at Hootsuite, suggesting that he has been spending too much time on “cardboard desktops” and land deals as the company’s problems have mounted.

In a related development, which may suggest a further contraction of investor interest in startups with very large valuations, The Wall Street Journal today reported that many Wall Street mutual funds are reducing their exposure to startup investments. Historically, mutual funds have invested in high-risk startups only via well-known, reputable venture capital firms, who solicit the mutual fund managers. Recently, however, in the cases of some startups like Uber and Hootsuite, the mutual funds have taken direct investment positions. This appears to be ending, and venture capital firms may be hard pressed to attract those funds to their own VC funds:

READ MORE: Mutual funds sour on startup investments

Canadian tech unicorn Hootsuite gets written down by Fidelity

Fidelity Investments cut the value of its stake in Hootsuite Media Inc., one of Canada’s most highly valued technology startups, in a sign that lowered U.S. investor expectations are making their way north of the border. The Boston asset manager wrote down its investment in Hootsuite, maker of social media marketing software, by 18 per cent.

Fidelity was the lead investor when Hootsuite raised $60-million in 2014. That financing round valued the Vancouver company at $1-billion, according to research firm CB Insights. Hootsuite is one of only two Canadian unicorns, the researcher said. The other is messaging app developer Kik Interactive Inc.

As startup financing begins to slow, investors have been reevaluating some of their portfolios. Like other fund managers, Fidelity periodically readjusts the value of its private stock holdings, based on a variety of factors, and is required to disclose the data publicly. The 18 per cent writedown of Hootsuite, from June to December, was disclosed in public filings.

Fidelity marked down its stakes in several corporate software startups in January, but it maintained high expectations for some social networking companies, including Pinterest Inc. and Snapchat Inc. With the writedown of Hootsuite, Fidelity values its holding below what it originally paid. Hootsuite didn’t immediately have a comment.
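
To put the writedown in rough perspective, purely as illustrative arithmetic (Fidelity’s actual share of the 2014 round, and its June carrying value, were not disclosed): if a stake had been carried at the full $60-million of the round Fidelity led in 2014, an 18 per cent markdown would value it at $60-million × (1 − 0.18), or about $49.2-million, consistent with the report that the holding now sits below what was originally paid.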

Last year, Hootsuite Chief Executive Officer Ryan Holmes said that an initial public offering was eventually in the cards but that he was focused at the time on increasing the company’s cash flow. Hootsuite hired a chief financial officer in October. Then it cut some employees in December.

Hootsuite said in October that more than 10 million people use its social media platform to help organize advertising campaigns, interact with customers or streamline their social media presence. The company also provides tools for businesses to produce content for their employees to share on their personal accounts. Hootsuite has said it’s raised at least $250-million since it was founded in 2008.

Raghwa Gopal Named New Accelerate Okanagan CEO. Can He Turn Things Around?

Well-known local entrepreneur and community activist Raghwa Gopal has been named the new CEO of Accelerate Okanagan with much fanfare. My sincere wishes for his success in this important new role in the community. However, it is extremely important to also recognize the major challenges he faces. Just this week BMO issued a report ranking Kelowna the worst job market in Canada, well behind many seemingly more distressed Ontario communities.

The reasons for Kelowna’s economic problems are deep and long-standing. Accelerate Okanagan was hailed years ago for its potential to boost the local economy. Unfortunately, despite support and large funding infusions from the BC Innovation Council, not much has happened over these years. The small handful of companies that have done well enough to survive or to be sold have had virtually zero impact on the economy. One such company was sold to a Silicon Valley networking company for about $20 million; another was sold to Telus Health for an undisclosed amount. In Silicon Valley this is usually referred to as “parking”: salvaging whatever is possible from a startup that did not do well.

The other examples of Okanagan success, Club Penguin and, more recently, Immersive Media, are prime examples of how Canadian companies are bought for a song, stripped of their intellectual property (IP), and eventually stripped of the jobs as well. In the case of Disney and Club Penguin, I know a bit of the background. A few years earlier, I had been invited, under NDA, to see Disney’s big-budget online project development, which had spent hundreds of millions without much to show for it. Club Penguin was dirt cheap in Disney’s world compared to those past losses.

Hootsuite is the one successful company whose founder is from Vernon, but CEO Ryan Holmes has openly admitted that he did not base Hootsuite in the Okanagan because he knew he would not be able to attract the necessary talent here.

READ MORE: 

Kelowna one of the toughest cities to find a job

More disturbing, the local Okanagan establishment seems lost in a delusion regarding the size and impact of its high-tech industry. Accelerate Okanagan recently published a report claiming that the high-tech industry here is valued at more than $1 billion, a figure that has been repeatedly cited by local leaders, including Kelowna Mayor Colin Basran. The fact is that no reputable industry analyst could honestly agree with the AO assessment: the report was little more than an unscrutinized survey, lacking the most basic rigor of true industry analysis. Moreover, the simplest comparison with another Canadian $1 billion industry (mobile phone advertising, for example) does not square with what we see in Kelowna.

Some time ago, I reported on New Zealand’s Ice House tech incubator economic impact report, which has much greater credibility. The AO report essentially claims that the Okanagan technology economy is more than twice the size of New Zealand’s; that is too big a leap of faith for me. Read New Zealand’s Ice House Startups Achieve Impressive Results and contrast it with the AO report.

So I offer my best wishes to Raghwa in his new position, and sincerely hope that he will be able to cut through the serious impediments to economic development and jobs growth in the Okanagan, particularly the need for a more realistic assessment of the current situation.

READ MORE: 

Can Accelerate Okanagan's Report On Local Tech Industry Economic Impact Be Believed?

READ MORE: 

http://mayo615.com/2014/12/19/okanagan-economy-and-jobs-market-likely-to-worsen-next-year/


Accelerate Okanagan names Raghwa Gopal as CEO

The Accelerate Okanagan Technology Association has named Raghwa Gopal, a veteran of the Kelowna technology community, as its new CEO.

Gopal had been acting CEO for the past two months; the announcement simply cements him into the full-time role.

Over his 28-year career, Gopal co-founded Vadim Software, maker of an asset management platform used by the Canadian government as well as provincial and municipal clients. The company eventually grew to generate $25 million in annual revenue and employed over 100 people before being acquired in 2001.

Gopal retired as president and chief technology officer of Vadim Software in 2006.

“It is an immense honor to be offered the position of CEO for Accelerate Okanagan, particularly because this is such an exciting time for the tech industry in the Okanagan and the province as a whole,” said Mr. Gopal.  “With the Okanagan Centre for Innovation (OCI) opening soon, the new BC Tech Fund, and new and innovative programs being offered by Accelerate Okanagan, I see tremendous opportunity for the growth of tech companies in the Okanagan.”

The Okanagan Centre for Innovation is a six-storey, 104,000 square foot facility under construction at the corner of Doyle and Ellis streets in Kelowna.

“After an exhaustive search that involved over 120 candidates, we are extremely pleased to announce Raghwa Gopal as AO’s new CEO,” said Accelerate Okanagan Board Chairman Blair Forrest. “Mr. Gopal was by far the best candidate measured against the core competencies established by our CEO Search Committee and we are very fortunate to have someone of his calibre to lead our organization through the next stage of growth.”

In his “retirement”, Gopal has been involved in a number of volunteer roles, including Director of the Okanagan College Foundation, the Rotary Club of Kelowna, the United Way, and the Central Okanagan Development Commission.

“He is a very well-known and respected person with an extensive history in our community who will bring many years of business acumen, industry expertise and knowledge to the role,” continued Forrest. “Through his prior involvement as acting CEO and Executive in Residence, Raghwa is very familiar with our team, association members, programs, clients, partners, government funding organizations and objectives.”

Statistics Canada last year named Kelowna B.C.’s fastest growing city, with a population growth of 1.8% over the previous year.

“One of my primary goals will be to create an ecosystem of collaboration between different stakeholders – both here in the Okanagan and province wide – to provide bigger and better opportunities for local companies to grow and thrive,” added Mr. Gopal. “I’m looking forward to help further cultivate and nurture the burgeoning tech industry in the Okanagan.”