Help Us Return Home to France to Mentor Entrepreneurs: FundRazr Campaign 🇫🇷

I want to return to France to give back my experience, skills, and technical knowledge to the country of my heritage. France’s industrial economy is in the doldrums, but new policies are stimulating innovation, the key to economic growth and productivity. Industry leaders in France with strong technology backgrounds are looking to contribute to this new economy, and I want to join them and give back.


Less than 24 hours after our campaign launch, we are already nearing 10% of our goal.


Link to our FundRazr campaign: Please Help Us Return Home to France to Mentor Entrepreneurs/Startups

I am a native-born Californian with French family heritage and a French wife. We are both French citizens preparing to return to France. My university background is in the Humanities and Social Sciences, with a year of graduate study at Oxford University, researching in the Bodleian Library. When I returned to northern California, I landed an entry-level job at Intel Corporation, which proved to be the crucible for my entire career: I eventually rose to be a senior executive in international business development at Intel. I have continued in international business throughout my career, working for a number of tech startups and venture capital investment firms over the years. I have led two tech industry consortia to develop global industry standards, directed a tech entrepreneurial incubator in Silicon Valley for the government of New Zealand, and collaborated on mentoring promising entrepreneurs both locally and around the world. I was an Adjunct Professor of Management at the University of British Columbia for four years.


I am now semi-retired, but very eager to return permanently to France to donate my technology industry experience and knowledge to assist French entrepreneurs in transforming France into an innovation-based economy.

FundRazr Campaign Story:

We are David Mayes and Isabelle Roux-Mayes, a married couple who are also French citizens. I am a native Californian who has spent my career working for a number of Silicon Valley companies and investment firms, beginning with Intel Corporation. I am now semi-retired, but very eager to return permanently to France to donate my technology industry experience and knowledge to assist French entrepreneurs in transforming France into an innovation-based economy. I am focusing specifically on building working relationships with three major new initiatives that could benefit from my background and achievements: The Camp in Aix-en-Provence, launched last year; Startup Garage in Paris; and 1kubator in Bordeaux.

I am more than happy to share my achievements and references to validate my credentials and verify my ability to make a serious contribution. You can start with my LinkedIn profile and references: David Mayes on LinkedIn. You may also contact me here or on FundRazr, where we can discuss my crowdfunding project.

Google’s Quantum Dream May Be Just Around The Corner

In 1981, Richard Feynman, probably the most famous physicist of his time, asked the question: “Can we simulate physics on a computer?” At the time the answer was “theoretically yes,” but practically not yet. Today, we may be on the verge of answering “yes” in practice to Feynman’s original question. Quantum computers operate in such a strange way, and are so radically different from today’s computers, that understanding them requires some grasp of quantum mechanics and bizarre properties like “quantum entanglement.” Quantum computers are in a realm orders of magnitude beyond today’s supercomputers, and their application to specific computational problems like cryptography, Big Data analysis, computational fluid dynamics (CFD), and subatomic physics will change our world. The Canadian quantum computing company D-Wave Systems has been at the center of Google’s efforts to pioneer this technology.



Reblogged from New Scientist

Google’s Quantum Dream May Be Just Around the Corner


31 August 2016

Revealed: Google’s plan for quantum computer supremacy

The field of quantum computing is undergoing a rapid shake-up, and engineers at Google have quietly set out a plan to dominate

SOMEWHERE in California, Google is building a device that will usher in a new era for computing. It’s a quantum computer, the largest ever made, designed to prove once and for all that machines exploiting exotic physics can outperform the world’s top supercomputers.

And New Scientist has learned it could be ready sooner than anyone expected – perhaps even by the end of next year.

The quantum computing revolution has been a long time coming. In the 1980s, theorists realised that a computer based on quantum mechanics had the potential to vastly outperform ordinary, or classical, computers at certain tasks. But building one was another matter. Only recently has a quantum computer that can beat a classical one gone from a lab curiosity to something that could actually happen. Google wants to create the first.

The firm’s plans are secretive, and Google declined to comment for this article. But researchers contacted by New Scientist all believe it is on the cusp of a breakthrough, following presentations at conferences and private meetings.

“They are definitely the world leaders now, there is no doubt about it,” says Simon Devitt at the RIKEN Center for Emergent Matter Science in Japan. “It’s Google’s to lose. If Google’s not the group that does it, then something has gone wrong.”

We have had a glimpse of Google’s intentions. Last month, its engineers quietly published a paper detailing their plans (arxiv.org/abs/1608.00263). Their goal, audaciously named quantum supremacy, is to build the first quantum computer capable of performing a task no classical computer can.

“It’s a blueprint for what they’re planning to do in the next couple of years,” says Scott Aaronson at the University of Texas at Austin, who has discussed the plans with the team.

So how will they do it? Quantum computers process data as quantum bits, or qubits. Unlike classical bits, these can store a mixture of both 0 and 1 at the same time, thanks to the principle of quantum superposition. It’s this potential that gives quantum computers the edge at certain problems, like factoring large numbers. But ordinary computers are also pretty good at such tasks. Showing quantum computers are better would require thousands of qubits, which is far beyond our current technical ability.
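
To make superposition concrete, here is a minimal Python sketch of my own (not anything from the Google team): a qubit modelled as a pair of amplitudes, pushed into an equal superposition by a Hadamard gate and then repeatedly measured:

```python
import random

# A qubit's state is two amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
# A classical bit is always exactly (1, 0) or (0, 1).
zero = (1.0, 0.0)

def hadamard(state):
    """Rotate a basis state into an equal superposition of 0 and 1."""
    a, b = state
    s = 2 ** -0.5
    return (s * (a + b), s * (a - b))

def measure(state):
    """Collapse to 0 or 1 with probability given by the squared amplitudes."""
    a, _ = state
    return 0 if random.random() < abs(a) ** 2 else 1

plus = hadamard(zero)                          # "a mixture of both 0 and 1"
ones = sum(measure(plus) for _ in range(10000))
print(ones / 10000)                            # ~0.5: an even split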

Instead, Google wants to claim the prize with just 50 qubits. That’s still an ambitious goal – publicly, they have only announced a 9-qubit computer – but one within reach.

“It’s Google’s to lose. If Google’s not the group that does it, then something has gone wrong”

To help it succeed, Google has brought the fight to quantum’s home turf. It is focusing on a problem that is fiendishly difficult for ordinary computers but that a quantum computer will do naturally: simulating the behaviour of a random arrangement of quantum circuits.

Any small variation in the input into those quantum circuits can produce a massively different output, so it’s difficult for the classical computer to cheat with approximations to simplify the problem. “They’re doing a quantum version of chaos,” says Devitt. “The output is essentially random, so you have to compute everything.”
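
A toy numpy experiment of my own illustrates Devitt’s point. Apply one random unitary (a stand-in for a random quantum circuit) to two inputs that differ in a single qubit, and the two output distributions share essentially nothing:

```python
import numpy as np

rng = np.random.default_rng(0)
n_qubits = 10                      # keep it small: memory doubles per qubit
dim = 2 ** n_qubits

# A Haar-random unitary stands in for a random quantum circuit.
z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
q, r = np.linalg.qr(z)
u = q * (np.diagonal(r) / np.abs(np.diagonal(r)))

# Two inputs differing in just one qubit: |00...00> and |00...01>.
a = np.zeros(dim, complex); a[0] = 1.0
b = np.zeros(dim, complex); b[1] = 1.0

pa, pb = np.abs(u @ a) ** 2, np.abs(u @ b) ** 2

# Total variation distance ~0.5: the outputs overlap no more than two
# unrelated random distributions would, so nothing computed for one input
# can be reused for the other -- you really do "have to compute everything".
print("total variation distance:", 0.5 * np.abs(pa - pb).sum())
```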

To push classical computing to the limit, Google turned to Edison, one of the most advanced supercomputers in the world, housed at the US National Energy Research Scientific Computing Center. Google had it simulate the behaviour of quantum circuits on increasingly larger grids of qubits, up to a 6 × 7 grid of 42 qubits.

This computation is difficult because as the grid size increases, the amount of memory needed to store everything balloons rapidly. A 6 × 4 grid needed just 268 megabytes, less than found in your average smartphone. The 6 × 7 grid demanded 70 terabytes, roughly 10,000 times that of a high-end PC.

Google stopped there because going to the next size up is currently impossible: a 48-qubit grid would require 2.252 petabytes of memory, almost double that of the top supercomputer in the world. If Google can solve the problem with a 50-qubit quantum computer, it will have beaten every other computer in existence.
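
The memory figures follow directly from state-vector bookkeeping: an n-qubit state needs 2^n complex amplitudes. A quick Python check of the article’s numbers (the precision split is my inference: 16-byte double-precision amplitudes reproduce the 268 MB and 70 TB figures, while the 48-qubit figure matches 8-byte single precision):

```python
# An n-qubit state vector holds 2**n complex amplitudes.
# Bytes-per-amplitude values below are assumptions chosen to match
# the article's figures, not something the article states.
for qubits, bytes_per_amp in [(24, 16), (42, 16), (48, 8)]:
    total_bytes = (2 ** qubits) * bytes_per_amp
    print(f"{qubits} qubits: {total_bytes:.3e} bytes")
# 24 qubits: 2.684e+08 bytes (~268 MB)
# 42 qubits: 7.037e+13 bytes (~70 TB)
# 48 qubits: 2.252e+15 bytes (~2.252 PB)
```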

Eyes on the prize

By setting out this clear test, Google hopes to avoid the problems that have plagued previous claims of quantum computers outperforming ordinary ones – including some made by Google.

Last year, the firm announced it had solved certain problems 100 million times faster than a classical computer by using a D-Wave quantum computer, a commercially available device with a controversial history. Experts immediately dismissed the results, saying they weren’t a fair comparison.

Google purchased its D-Wave computer in 2013 to figure out whether it could be used to improve search results and artificial intelligence. The following year, the firm hired John Martinis at the University of California, Santa Barbara, to design its own superconducting qubits. “His qubits are way higher quality,” says Aaronson.

It’s Martinis and colleagues who are now attempting to achieve quantum supremacy with 50 qubits, and many believe they will get there soon. “I think this is achievable within two or three years,” says Matthias Troyer at the Swiss Federal Institute of Technology in Zurich. “They’ve showed concrete steps on how they will do it.”

Martinis and colleagues have discussed a number of timelines for reaching this milestone, says Devitt. The earliest is by the end of this year, but that is unlikely. “I’m going to be optimistic and say maybe at the end of next year,” he says. “If they get it done even within the next five years, that will be a tremendous leap forward.”

The first successful quantum supremacy experiment won’t give us computers capable of solving any problem imaginable – based on current theory, those will need to be much larger machines. But having a working, small computer could drive innovation, or augment existing computers, making it the start of a new era.

Aaronson compares it to the first self-sustaining nuclear reaction, achieved by the Manhattan project in Chicago in 1942. “It might be a thing that causes people to say, if we want a full-scalable quantum computer, let’s talk numbers: how many billions of dollars?” he says.

Solving the challenges of building a 50-qubit device will prepare Google to construct something bigger. “It’s absolutely progress to building a fully scalable machine,” says Ian Walmsley at the University of Oxford.

For quantum computers to be truly useful in the long run, we will also need robust quantum error correction, a technique to mitigate the fragility of quantum states. Martinis and others are already working on this, but it will take longer than achieving quantum supremacy.

Still, achieving supremacy won’t be dismissed.

“Once a system hits quantum supremacy and is showing clear scale-up behaviour, it will be a flare in the sky to the private sector,” says Devitt. “It’s ready to move out of the labs.”

“The field is moving much faster than expected,” says Troyer. “It’s time to move quantum computing from science to engineering and really build devices.”

CERN Hadron Collider Again Surprises Us

I previously posted about how we are approaching the limits of our ability to achieve physical proof in quantum physics. Why should we care? Where do we go after CERN’s Large Hadron Collider confirmed the existence of the Higgs boson, the particle that explains why other particles have mass? That said, two separate teams at CERN are debating the results of further experiments that suggest the possible existence of a new subatomic particle. This particle, if it exists and can be confirmed, may support the existence of additional dimensions of space and time. The MIT Technology Review has also suggested that the Large Hadron Collider could potentially test the validity of Star Trek-style “hyperdrive” propulsion. We should care because this is the future of the technology that will continue to change our lives.



Researchers at the Large Hadron Collider at CERN are smashing together protons to search for new particles and forces. 

Does the Higgs boson have a cousin?

Two teams of physicists working independently at the Large Hadron Collider at CERN, the European Organization for Nuclear Research, reported on Tuesday that they had seen traces of what could be a new fundamental particle of nature.

One possibility, out of a gaggle of wild and not-so-wild ideas springing to life as the day went on, is that the particle — assuming it is real — is a heavier version of the Higgs boson, a particle that explains why other particles have mass. Another is that it is a graviton, the supposed quantum carrier of gravity, whose discovery could imply the existence of extra dimensions of space-time.

At the end of a long chain of “ifs” could be a revolution, the first clues to a theory of nature that goes beyond the so-called Standard Model, which has ruled physics for the last quarter-century.

It is, however, far too soon to shout “whale ahoy,” physicists both inside and outside CERN said, noting that the history of particle physics is rife with statistical flukes and anomalies that disappeared when more data was compiled.

A coincidence is the most probable explanation for the surprising bumps in data from the collider, physicists from the experiments cautioned, saying that a lot more data was needed and would in fact soon be available.

“I don’t think there is anyone around who thinks this is conclusive,” said Kyle Cranmer, a physicist from New York University who works on one of the CERN teams, known as Atlas. “But it would be huge if true,” he said, noting that many theorists had put their other work aside to study the new result.

When all the statistical effects are taken into consideration, Dr. Cranmer said, the bump in the Atlas data had about a 1-in-93 chance of being a fluke — far short of the 1-in-3.5-million odds of mere chance, known as five-sigma, considered the gold standard for a discovery. That might not be enough to bother presenting in a talk except for the fact that the competing CERN team, named C.M.S., found a bump in the same place.
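
Both probabilities quoted here are just points on the Gaussian tail, so converting between “odds of a fluke” and sigma takes a few lines of Python (my own arithmetic check, not the collaboration’s):

```python
import math

def fluke_probability(sigma):
    """One-sided tail probability of a sigma-sized Gaussian fluctuation."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

print(f"5 sigma -> 1 in {1 / fluke_probability(5):,.0f}")   # ~1 in 3.5 million

# Invert numerically: how many sigma is a 1-in-93 fluke?
sigma = 0.0
while fluke_probability(sigma) > 1 / 93:
    sigma += 0.001
print(f"1 in 93 -> about {sigma:.2f} sigma")                # ~2.30 sigma
```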

“What is nice is that it is not a particularly crazy signal, in a quite clean channel,” said Nima Arkani-Hamed, a particle theorist at the Institute for Advanced Study in Princeton, N.J., speaking before the announcement. “So, while we are nowhere near moving champagne even vaguely close to the fridge, it is intriguing.”

Physicists could not help wondering if history was about to repeat itself. It was four years ago this week that the same two teams’ detection of matching bumps in Large Hadron Collider data set the clock ticking for the discovery of the Higgs boson six months later. And so the auditorium at CERN, outside Geneva, was so packed on Tuesday that some officials had to sit on the floor for a two-hour presentation about the center’s recent work that began with the entire crowd singing “Happy Birthday” to Claire Lee, one of the experimenters, from Brookhaven National Laboratory on Long Island.

At one point, Rolf Heuer, the departing director-general of CERN, tried to get people to move off the steps, declaring they were a fire hazard. When they did not move, he joked that he now knew he was a lame duck.

When physicists announced in 2012 that they had indeed discovered the Higgs boson, it was not the end of physics. It was not even, to paraphrase Winston Churchill, the beginning of the end.

It might, they hoped, be the end of the beginning.

The Higgs boson was the last missing piece of the Standard Model, which explains all we know about subatomic particles and forces. But there are questions this model does not answer, such as what happens at the bottom of a black hole, the identity of the dark matter and dark energy that rule the cosmos, or why the universe is matter and not antimatter.

The Large Hadron Collider was built at a cost of some $10 billion, to speed protons around a 17-mile underground track at more than 99 percent of the speed of light and smash them together in search of new particles and forces of nature. By virtue of Einstein’s equivalence of mass and energy, the more energy poured into these collisions, the more massive particles can come out of them. And by the logic of quantum microscopy, the more energy they have to spend, the smaller and more intimate details of nature physicists can see.

Parked along the underground racetrack are a pair of mammoth six-story conglomerations of computers, crystals, wires and magnets: Atlas and C.M.S., each operated by 3,000 physicists who aim to catch and classify everything that comes out of those microscopic samples of primordial fire.

During its first two years of running, the collider fired protons, the building blocks of ordinary matter, to energies of about four trillion electron volts, in the interchangeable units of mass and energy that physicists prefer. By way of comparison, the naked proton weighs in at about one billion electron volts and the Higgs boson is about 125 billion electron volts.

Since June, after a two-year shutdown, CERN physicists have been running their collider at nearly twice the energy with which they discovered the Higgs, firing twin beams of protons with 6.5 trillion electron volts of energy at each other in search of new particles to help point them to deeper laws.
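
Those figures pin down just how close to light speed the protons travel. A quick back-of-the-envelope calculation from special relativity, using the roughly one-billion-electron-volt proton rest energy quoted above:

```python
import math

PROTON_REST_ENERGY_GEV = 0.938   # "the naked proton weighs in at about one billion electron volts"
beam_energy_gev = 6500.0         # 6.5 trillion electron volts per beam

gamma = beam_energy_gev / PROTON_REST_ENERGY_GEV   # Lorentz factor
beta = math.sqrt(1.0 - 1.0 / gamma ** 2)           # speed as a fraction of c

print(f"gamma = {gamma:,.0f}")     # ~6,930
print(f"v/c   = {beta:.9f}")       # 0.999999990 -- "more than 99 percent"
```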

The main news since then has been that there is no news yet, only tantalizing hints, bumps in the data, that might be new particles and signposts of new theories, or statistical demons.

The most intriguing result so far, reported on Tuesday, is an excess of pairs of gamma rays corresponding to an energy of about 750 billion electron volts. The gamma rays, the physicists said, could be produced by the radioactive decay of a new particle, in this case perhaps a cousin of the Higgs boson, which itself was first noticed because it decayed into an abundance of gamma rays.

Or it could be a more massive particle that has decayed in steps down to a pair of photons. Nobody knows. No model predicted this, which is how some scientists like it.

“The more nonstandard the better,” said Joe Lykken, the director of research at the Fermi National Accelerator Laboratory and a member of one of the CERN teams. “It will give people a lot to think about. We get paid to speculate.”

Maria Spiropulu, a professor at Caltech and member of one of the detector teams, said, “As experimentalists, we see a 750-billion-electron-volt beast decaying to two photons.” Explaining it, she added, is up to the theorists.

The new results are based on the analysis of some 400 trillion proton-proton collisions.

If the particle is real, Dr. Lykken said, physicists should know by this summer, when they will have 10 times as much data to present to scientists from around the world who will convene in Chicago, Fermilab’s backyard.

Such a discovery would augur a fruitful future for cosmological wanderings and for the CERN collider, which will be running for the next 20 years. It could also elevate proposals now on drawing boards in China and elsewhere to build even larger, more powerful colliders.

“We are barely coming to terms with the power and the glory” of the CERN collider’s ability to operate at 13 trillion electron volts, Dr. Spiropulu said in a text message. “We are now entering the era of taking a shot in the dark!”

BIG IDEAS: Physics At The Crossroads



This is another in my occasional series on Big Ideas.  Last night I had my first opportunity to watch Particle Fever, the acclaimed 2014 documentary on the Large Hadron Collider (LHC) and the discovery of the Higgs Boson particle.  I recommend it to everyone. This followed my reading of a much more recent New York Times Op-Ed this week, describing a crisis in physics resulting from the discovery of the Higgs Boson.  Essentially, the science of physics has no ability any time in the foreseeable future to experimentally go beyond the Higgs Boson.  Physics is unlikely to be able to find The Holy Grail: a unifying Theory of Everything tying Einstein and the Higgs Boson into one simple elegant explanation.

A debate has erupted among physicists around the world regarding the fundamental scientific imperative to empirically verify theories through experiments like those at the LHC. With the scale and complexity of the required experiments outstripping human capability, the question is being raised: can we explicitly set aside the need for experimental confirmation of today’s most ambitious cosmic theories, so long as those theories are “sufficiently elegant and explanatory”? The emergence of this debate can clearly be seen in the Particle Fever interviews with various LHC physicists. While I do understand the quandary, my fear is that science could descend into competing belief systems and give comfort to religious groups who believe the Earth is only 6,000 years old. That would be an even greater catastrophe. Any comments or thoughts on this?

Particle Fever, the 2014 award-winning documentary on the Large Hadron Collider and the discovery of the Higgs Boson particle.

READ MORE: NY Times: Crisis At The Edge of Physics

Do physicists need empirical evidence to confirm their theories? You may think that the answer is an obvious yes, experimental confirmation being the very heart of science. But a growing controversy at the frontiers of physics and cosmology suggests that the situation is not so simple.

A few months ago in the journal Nature, two leading researchers, George Ellis and Joseph Silk, published a controversial piece called “Scientific Method: Defend the Integrity of Physics.” They criticized a newfound willingness among some scientists to explicitly set aside the need for experimental confirmation of today’s most ambitious cosmic theories — so long as those theories are “sufficiently elegant and explanatory.” Despite working at the cutting edge of knowledge, such scientists are, for Professors Ellis and Silk, “breaking with centuries of philosophical tradition of defining scientific knowledge as empirical.”

Whether or not you agree with them, the professors have identified a mounting concern in fundamental physics: Today, our most ambitious science can seem at odds with the empirical methodology that has historically given the field its credibility.

How did we get to this impasse? In a way, the landmark detection three years ago of the elusive Higgs boson particle by researchers at the Large Hadron Collider marked the end of an era. Predicted about 50 years ago, the Higgs particle is the linchpin of what physicists call the “standard model” of particle physics, a powerful mathematical theory that accounts for all the fundamental entities in the quantum world (quarks and leptons) and all the known forces acting between them (electromagnetism and the strong and weak nuclear forces).

But the standard model, despite the glory of its vindication, is also a dead end. It offers no path forward to unite its vision of nature’s tiny building blocks with the other great edifice of 20th-century physics: Einstein’s cosmic-scale description of gravity. Without a unification of these two theories — a so-called theory of quantum gravity — we have no idea why our universe is made up of just these particles, forces and properties. (We also can’t know how to truly understand the Big Bang, the cosmic event that marked the beginning of time.)

This is where the specter of an evidence-independent science arises. For most of the last half-century, physicists have struggled to move beyond the standard model to reach the ultimate goal of uniting gravity and the quantum world. Many tantalizing possibilities (like the often-discussed string theory) have been explored, but so far with no concrete success in terms of experimental validation.

Today, the favored theory for the next step beyond the standard model is called supersymmetry (which is also the basis for string theory). Supersymmetry predicts the existence of a “partner” particle for every particle that we currently know. It doubles the number of elementary particles of matter in nature. The theory is elegant mathematically, and the particles whose existence it predicts might also explain the universe’s unaccounted-for “dark matter.” As a result, many researchers were confident that supersymmetry would be experimentally validated soon after the Large Hadron Collider became operational.

That is not how things have turned out: so far, the collider has found no supersymmetric particles, and if it continues to come up empty, some physicists will abandon the theory. But many won’t. Some may choose instead to simply retune their models to predict supersymmetric particles at masses beyond the reach of the Large Hadron Collider’s power of detection — and that of any foreseeable substitute.

Implicit in such a maneuver is a philosophical question: How are we to determine whether a theory is true if it cannot be validated experimentally? Should we abandon it just because, at a given level of technological capacity, empirical support might be impossible? If not, how long should we wait for such experimental machinery before moving on: ten years? Fifty years? Centuries?

Consider, likewise, the cutting-edge theory in physics that suggests that our universe is just one universe in a profusion of separate universes that make up the so-called multiverse. This theory could help solve some deep scientific conundrums about our own universe (such as the so-called fine-tuning problem), but at considerable cost: Namely, the additional universes of the multiverse would lie beyond our powers of observation and could never be directly investigated. Multiverse advocates argue nonetheless that we should keep exploring the idea — and search for indirect evidence of other universes.

The opposing camp, in response, has its own questions. If a theory successfully explains what we can detect but does so by positing entities that we can’t detect (like other universes or the hyperdimensional superstrings of string theory) then what is the status of these posited entities? Should we consider them as real as the verified particles of the standard model? How are scientific claims about them any different from any other untestable — but useful — explanations of reality?

Recall the epicycles, the imaginary circles that Ptolemy used and formalized around A.D. 150 to describe the motions of planets. Although Ptolemy had no evidence for their existence, epicycles successfully explained what the ancients could see in the night sky, so they were accepted as real. But they were eventually shown to be a fiction, more than 1,500 years later. Are superstrings and the multiverse, painstakingly theorized by hundreds of brilliant scientists, anything more than modern-day epicycles?

Just a few days ago, scientists restarted investigations with the Large Hadron Collider, after a two-year hiatus. Upgrades have made it even more powerful, and physicists are eager to explore the properties of the Higgs particle in greater detail. If the upgraded collider does discover supersymmetric particles, it will be an astonishing triumph of modern physics. But if nothing is found, our next steps may prove to be difficult and controversial, challenging not just how we do science but what it means to do science at all.

Why I Hate Dragons’ Den

A local journal today glowingly reported that not one but two local companies had won investment on Dragons’ Den, the Canadian “reality” television show. What struck me about the two apparently best “winning ideas” from our community was how utterly mundane they were: an “empty beer bottle handling system” and “illuminated party clothing.” As an entrepreneur myself, I first need to give respect to the two entrepreneurs who achieved this success with the likes of Kevin O’Leary and the other investors. It is no mean feat, and they should be acknowledged and congratulated for it. On the other hand, these are not the kind of ideas that will make a major dent in the local or Canadian economy. Meanwhile in Vancouver, two startups, D-Wave and General Fusion, are working on Big Ideas that could change our lives.



Dragons’ Den is nothing more than artificially concocted “reality” TV entertainment. In many cases, the “entertainment value” comes at the expense of the entrepreneurs themselves, some of whom should never have been put on television in the first place. IMHO, this is what is fundamentally wrong with Dragons’ Den: it is pure Fantasyland. My own UBC entrepreneurship students have also developed similar and very worthy “small business” ideas. But as worthy as they may be on a small scale, these ideas do not further any vision or goal of entrepreneurship’s importance to the Canadian economy. I judged a graduate student entrepreneurship competition this week that was dominated by Web apps, in the face of overwhelming evidence that there is very little opportunity or investor interest left in Web apps; someone recently estimated that there will soon be a billion Web apps out there. Curiously, Dragons’ Den seems to cull out Web apps entirely, though the producers must see a lot of them, preferring to broadcast eccentric entrepreneurs with really wacky ideas for their entertainment value.

“Entrepreneurship” has become the current fad, garnering TV viewers and advertiser dollars while conveniently ignoring the bigger issues for the Canadian economy. Large sums of government money are being doled out without adequate oversight of the return on the investment; I was recently advised to “follow the government dollars” being thrown at entrepreneurial incubators. There seems to be no consideration of the importance of Big Ideas and of solving Big Problems: just entertainment for entertainment’s sake, viewer ratings, and advertising dollars.

Coming from Silicon Valley, I see the current Canadian entrepreneurship landscape as a confused, overheated, and over-invested mess. If I were Kevin O’Leary, I would not be able to live with myself on Dragons’ Den, acting as if caring only about making his own money equated to some greater economic purpose for Canadians. I prefer to chase Big Ideas.


Winfield Man Latest to Do a Deal on Dragons’ Den

Another Okanagan businessman has made a deal in Dragons’ Den.

Winfield’s Casey Binkley received four offers from the Dragons for FastRack, the product he pitched along with his partner, Mitchell Lesbirel.

Casey Binkley (left) and partner Mitchell Lesbirel pitch to the Dragons

The product was invented by Lesbirel to solve a problem many bars and restaurants have: collecting and clearing their empty bottles after a busy night. Empties that drip and soak cardboard boxes until they tear, spilling bottles everywhere, are a hassle familiar to many in the industry and beyond. Lesbirel’s answer is a simple plastic rack that allows draining, easy organization, and mess-free transfer to cardboard boxes.

As part of their pitch, the two men ran a fun race with the Dragons to demonstrate how the product works.


The partners asked the Dragons for $50,000 for 10% of their business and eventually settled on a deal with Jim Treliving, who brings expertise in the restaurant industry. Treliving offered $50,000 for 5%, with no royalty for the first 9 months, dropping to 3% after he recouped his capital.

Binkley and FastRack are the second Okanagan company to make a deal with the Dragons in recent weeks, after Kelowna’s Fur Glory appeared on the show with their special illuminated party clothing.

You can learn more about FastRack on their website and check out their pitch in the video below.


Quantum Computing Takes Center Stage In Wake of NSA Encryption Cracking



In the late 1990s, while I was with Ascend Communications, I participated in the creation of the Point-to-Point Tunneling Protocol (PPTP) with engineers at Microsoft and Cisco Systems, now an Internet Engineering Task Force (IETF) industry standard. PPTP is the technical means for creating the “virtual private networks” we use at UBC: private packets are encrypted with 128-bit keys and buried inside public Internet packets. It was an ingenious solution, enabling private Internet traffic that we assumed would remain secure for a very long time. It was not to be, as we now know. Most disturbing, in the 1990s the US Congress debated giving the government the key to all encryption, a proposal that was resoundingly defeated. Now, the NSA appears to have illegally circumvented this prohibition and cracked encryption anyway. But this discussion is not about the political, legal, and moral issues, significant as they are. In this post I am more interested in exploring the question: “So now what do we do?” There may be an answer on the horizon, and Canada is already a significant participant in the potential solution.
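
For readers who have never peeked inside a VPN, here is a toy Python sketch of the tunneling idea only: encrypt a private packet, then bury the ciphertext inside an ordinary public packet. Real PPTP encapsulates PPP frames in GRE with its own negotiated encryption, so treat everything below, including the addresses and the use of the third-party cryptography package, as illustration, not the protocol:

```python
# Conceptual sketch only: this is NOT how PPTP actually works; it just
# shows "private data encrypted, then wrapped in a public packet".
from cryptography.fernet import Fernet   # third-party: pip install cryptography

key = Fernet.generate_key()              # secret shared by the two endpoints
tunnel = Fernet(key)

inner_packet = b"GET /payroll HTTP/1.1\r\nHost: intranet.example\r\n\r\n"
ciphertext = tunnel.encrypt(inner_packet)

# The outer packet crosses the open internet; eavesdroppers see headers
# plus an opaque blob "buried" inside. Addresses are reserved examples.
outer_packet = b"SRC=203.0.113.5 DST=198.51.100.7 PROTO=TUNNEL " + ciphertext

# The far endpoint unwraps and decrypts.
received = outer_packet.split(b" ", 3)[3]
assert tunnel.decrypt(received) == inner_packet
```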

As it happens, Canada is already at the forefront of quantum computing, a critically important new area of research and development with significant future potential in both computing and cryptography. I have previously written about Vancouver-based D-Wave, which has produced commercial systems purchased by Google and Lockheed Martin Aerospace. The Institute for Quantum Computing in Waterloo, Ontario, is the other major centre of quantum computing research in Canada. Without taking a major diversion to explain quantum mechanics and its applications in computing and cryptography, there is a great PBS Nova broadcast, available online, which provides a basic tutorial. The Economist article below also does an admirable job of making this area understandable and of explaining the role the Waterloo research centre is playing in advancing cryptography to an entirely new level.

We need to ensure that Canada remains at the forefront of this critically important new technology.

Cryptography

The solace of quantum

Eavesdropping on secret communications is about to get harder

CRYPTOGRAPHY is an arms race between Alice and Bob, and Eve. These are the names cryptographers give to two people who are trying to communicate privily, and to a third who is trying to intercept and decrypt their conversation. Currently, Alice and Bob are ahead—just. But Eve is catching up. Alice and Bob are therefore looking for a whole new way of keeping things secret. And they may soon have one, courtesy of quantum mechanics.

At the moment cryptography concentrates on making the decrypting part as hard as possible. The industry standard, known as RSA (after its inventors, Ron Rivest, Adi Shamir and Leonard Adleman, of the Massachusetts Institute of Technology), relies on two keys, one public and one private. These keys are very big numbers, each of which is derived from the product of the same two prime numbers. Anyone can encrypt a message using the public key, but only someone with the private key can decrypt it. To find the private key, you have to work out what the primes are from the public key. Make the primes big enough—and hunting big primes is something of a sport among mathematicians—and the task of factorising the public key to reveal the primes, though possible in theory, would take too long in practice. (About 40 quadrillion years with the primes then available, when the system was introduced in 1977.)
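
The scheme is easy to demonstrate end to end with deliberately tiny numbers. A toy Python example using the textbook primes 61 and 53 (real keys use primes hundreds of digits long, which is exactly what makes factorising infeasible):

```python
# Toy RSA with textbook-sized primes; never do this with small numbers
# in practice.
p, q = 61, 53
n = p * q                    # public modulus: 3233
phi = (p - 1) * (q - 1)      # 3120, computable only if you know p and q
e = 17                       # public exponent, coprime with phi
d = pow(e, -1, phi)          # private exponent: 2753 (Python 3.8+)

message = 65
ciphertext = pow(message, e, n)      # anyone with (e, n) can encrypt
recovered = pow(ciphertext, d, n)    # only the holder of d can decrypt
assert recovered == message
print(ciphertext, recovered)         # 2790 65
```

Eve’s task is factorising n back into p and q: instant at four digits, believed intractable at six hundred.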

Since the 1970s, though, the computers that do the factorisation have got bigger and faster. Some cryptographers therefore fear for the future of RSA. Hence the interest in quantum cryptography.

Alice, Bob and Werner, too?

The most developed form of quantum cryptography, known as quantum key distribution (QKD), relies on stopping interception, rather than preventing decryption. Once again, the key is a huge number—one with hundreds of digits, if expressed in the decimal system. Alice sends this to Bob as a series of photons (the particles of light) before she sends the encrypted message. For Eve to read this transmission, and thus obtain the key, she must destroy some photons. Since Bob will certainly notice the missing photons, Eve will need to create and send identical ones to Bob to avoid detection. But Alice and Bob (or, rather, the engineers who make their equipment) can stop that by using two different quantum properties, such as the polarities of the photons, to encode the ones and zeros of which the key is composed. According to Werner Heisenberg’s Uncertainty Principle, only one of these two properties can be measured, so Eve cannot reconstruct each photon without making errors. If Bob detects such errors he can tell Alice not to send the actual message until the line has been secured.
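
A rough simulation shows why eavesdropping is self-defeating. The sketch below is a simplified BB84-style model of my own, not ID Quantique’s system; an intercept-and-resend Eve shows up as roughly a 25% error rate in the sifted key:

```python
import random

# Simplified BB84-style run: '+' and 'x' are the two encoding bases.
N = 20000

def measure(bit, prep_basis, meas_basis):
    # Matching basis: the bit survives. Wrong basis: a 50/50 coin flip.
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def run(eavesdrop):
    errors = kept = 0
    for _ in range(N):
        alice_bit = random.randint(0, 1)
        alice_basis = random.choice("+x")
        send_bit, send_basis = alice_bit, alice_basis
        if eavesdrop:                        # intercept-and-resend Eve
            eve_basis = random.choice("+x")
            send_bit = measure(send_bit, send_basis, eve_basis)
            send_basis = eve_basis           # she re-sends in her basis
        bob_basis = random.choice("+x")
        bob_bit = measure(send_bit, send_basis, bob_basis)
        if alice_basis == bob_basis:         # sifting: keep matching bases
            kept += 1
            errors += (bob_bit != alice_bit)
    return errors / kept

print(f"error rate, quiet line: {run(False):.1%}")   # 0.0%
print(f"error rate, with Eve:   {run(True):.1%}")    # ~25%
```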

One exponent of this approach is ID Quantique, a Swiss firm. In collaboration with Battelle, an American one, it is building a 700km (440-mile) fibre-optic QKD link between Battelle’s headquarters in Columbus, Ohio, and the firm’s facilities in and around Washington, DC. Battelle will use this to protect its own information, and the link will also be hired out to other firms that want to move sensitive data around.

QuintessenceLabs, an Australian firm, has a different approach to encoding. Instead of tinkering with photons’ polarities, it changes their phases and amplitudes. The effect is the same, though: Eve will necessarily give herself away if she eavesdrops. Using this technology, QuintessenceLabs is building a 560km QKD link between the Jet Propulsion Laboratory in Pasadena, California, which organises many of NASA’s unmanned scientific missions, and the Ames Research Centre in Silicon Valley, where a lot of the agency’s scientific investigations are carried out.

A third project, organised by Jane Nordholt of Los Alamos National Laboratory, has just demonstrated how a pocket-sized QKD transmitter called the QKarD can secure signals sent over public data networks to control smart electricity grids. Smart grids balance demand and supply so that electricity can be distributed more efficiently. This requires constant monitoring of the voltage, current and frequency of the grid in lots of different places—and the rapid transmission of the results to control centres. That transmission, however, also needs to be secure in case someone malicious wants to bring the system down.

In their different ways, all these projects are ambitious. All, though, rely on local fixed lines to carry the photons. Other groups of researchers are thinking more globally. To do that means sending quantum-secured data to and from satellites.

At least three groups are working on this: Thomas Jennewein and his team at the Institute for Quantum Computing in Waterloo, Canada; a collaboration led by Anton Zeilinger at the University of Vienna and Jian-Wei Pan at the University of Science and Technology of China; and Alex Ling and Artur Ekert at the Centre for Quantum Technologies in Singapore.

Dr Jennewein’s proposal is for Alice to beam polarisation-encoded photons to a satellite. Once she has established a key, Bob, on another continent, will wait until the satellite passes over him so he can send some more photons to it to create a second key. The satellite will then mix the keys together and transmit the result to Bob, who can work out the first key because he has the second. Alice and Bob now possess a shared key, so they can communicate securely by normal (less intellectually exhausting) terrestrial networks. Dr Jennewein plans to test the idea, using an aircraft rather than a satellite, at some point during the next 12 months.
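
The key-mixing step itself is simple: the satellite broadcasts only the XOR of the two keys, which on its own reveals neither. A few lines of Python sketching the recovery (my illustration of the scheme as described, not Dr Jennewein’s actual protocol):

```python
import secrets

# The satellite holds k_alice (from Alice's pass) and k_bob (from Bob's),
# broadcasts only their XOR, and discards both. The broadcast alone is
# useless; Bob combines it with his own key to recover Alice's.
def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

k_alice = secrets.token_bytes(32)   # stand-in for the first QKD key
k_bob   = secrets.token_bytes(32)   # stand-in for the second QKD key

broadcast = xor(k_alice, k_bob)     # safe to send over a public channel
assert xor(broadcast, k_bob) == k_alice   # Bob's side of the recovery
```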

An alternative, but more involved, satellite method is to use entangled photon pairs. Both Dr Zeilinger’s and Dr Ling’s teams have been trying this.

Entanglement is a quantum effect that connects photons intimately, even when they are separated by a large distance. Measure one particle and you know the state of its partner. In this way Alice and Bob can share a key made of entangled photon pairs generated on a satellite. Dr Zeilinger hopes to try this with a QKD transmitter based on the International Space Station. He and his team have been experimenting with entanglement at ground level for several years. In 2007 they sent entangled photon pairs 144km through the air across the Canary Islands. Dr Ling’s device will test entanglement in orbit, but not send photons down to Earth.

If this sort of thing works at scale, it should keep Alice and Bob ahead for years. As for poor Eve, she will find herself entangled in an unbreakable quantum web.

From the print edition: Science and technology