D-WAVE QUANTUM COMPUTER IS A STARGATE HIVE MIND PORTAL TO THE UNIVERSE INSIDE YOUR BRAIN

AMAZON, GOOGLE, LOCKHEED MARTIN, USC, LOS ALAMOS NATIONAL LABORATORY, AND NASA ARE USING THE D-WAVE QUANTUM COMPUTER TO CREATE THE NEW VIRTUAL WORLD ORDER.
A world where every thought, feeling, and perception is recorded. A world where digital, artificially intelligent autonomous agents can see through your eyes, hear through your ears, and control your thoughts and behavior. They travel over the internet at the speed of light and use the brain as a portal to perceive and experience the real world. They can become embodied in the mind without a person's knowledge or consent, or, as in the case of many Targeted Individuals, they can make their presence obviously known.
THE FOLLOWING EXCERPT IS FROM AN ARTICLE PUBLISHED BY WIRED MAGAZINE'S KEVIN KELLY:
The first big technology platform was the web, which digitized information, subjecting knowledge to the power of algorithms; it came to be dominated by Google. The second great platform was social media, running primarily on mobile phones. It digitized people and subjected human behavior and relationships to the power of algorithms, and it is ruled by Facebook and WeChat.
We are now at the dawn of the third platform, which will digitize the rest of the world. On this platform, all things and places will be machine-readable, subject to the power of algorithms. Also, like its predecessors, this new platform will unleash the prosperity of thousands more companies in its ecosystem, and a million new ideas—and problems—that weren’t possible before machines could read the world.
To recreate a map that is as big as the globe—in 3D, no less—you need to photograph all places and things from every possible angle, all the time, which means you need to have a planet full of cameras that are always on.
We are making that distributed, all-seeing camera network by reducing cameras to pinpoint electric eyes that can be placed anywhere and everywhere. The mirrorworld will be a world governed by light rays zipping around, coming into cameras, leaving displays, entering eyes, a never-ending stream of photons painting forms that we walk through and visible ghosts that we touch. The laws of light will govern what is possible.
New technologies bestow new superpowers. The mirrorworld promises super vision. We’ll have a type of x-ray vision able to see into objects via their virtual ghosts, exploding them into constituent parts, able to untangle their circuits visually. It will be the Photonic Era.
But here’s the most important thing: Robots will see this world. Indeed, this is already the perspective from which self-driving cars and robots see the world today, that of reality fused with a virtual shadow. When a robot is finally able to walk down a busy city street, the view it will have in its silicon eyes and mind will be the mirrorworld version of that street. The robot’s success in navigating will depend on the previously mapped contours of the road—existing 3D scans.
Of course, like all interactions in the mirrorworld, this virtual realm will be layered over the view of the physical world, so the robot will also see the real-time movements of people as they walk by. Much of the real-time digitization of moving things will be done by the cars themselves as they drive around, because all that a robot sees will be instantly projected into the mirrorworld for the benefit of other machines. When a robot looks, it will be both seeing for itself and providing a scan for other robots.
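The two paragraphs above describe a concrete architecture: a static, previously scanned 3D map layered under live detections, with each machine's observations published back into a shared model that other machines can immediately reuse. The following is only a minimal Python sketch of that idea; every name in it (PriorMap, LiveDetection, SharedWorldModel, and so on) is hypothetical and not taken from any real robotics or mapping system.

```python
# Minimal sketch (hypothetical names, not a real robotics stack) of the
# shared-map idea described above: a robot navigates against a prior 3D
# scan, overlays live detections, and publishes what it sees so other
# machines can reuse the same "mirrorworld" layer.

from dataclasses import dataclass, field
from typing import Dict, List, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class PriorMap:
    """Previously captured 3D scan of a street (static geometry)."""
    landmarks: Dict[str, Point3D]

@dataclass
class LiveDetection:
    """A moving object observed in real time (e.g. a pedestrian)."""
    label: str
    position: Point3D

@dataclass
class SharedWorldModel:
    """Shared store that every robot reads from and writes to."""
    static_map: PriorMap
    dynamic_objects: List[LiveDetection] = field(default_factory=list)

    def publish(self, detections: List[LiveDetection]) -> None:
        # One robot's observations become visible to all other robots.
        self.dynamic_objects.extend(detections)

    def fused_view(self) -> Dict[str, Point3D]:
        # What any robot "sees": static landmarks plus live objects.
        view = dict(self.static_map.landmarks)
        for i, obj in enumerate(self.dynamic_objects):
            view[f"{obj.label}_{i}"] = obj.position
        return view

if __name__ == "__main__":
    world = SharedWorldModel(PriorMap({"curb": (0.0, 1.5, 0.0),
                                       "lamp_post": (3.0, 1.5, 0.0)}))
    # Robot A detects a pedestrian and publishes it.
    world.publish([LiveDetection("pedestrian", (1.2, 1.5, 0.4))])
    # Robot B immediately sees the fused map: prior scan plus live objects.
    print(world.fused_view())
```

In this sketch the prior scan plays the role of Kelly's "previously mapped contours of the road," while publish() stands in for the instant projection of one robot's view into the mirrorworld for the benefit of others.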
In the mirrorworld too, virtual bots will become embodied; they’ll get a virtual, 3D, photorealistic shell, whether machine, animal, human, or alien. Inside the mirrorworld, agents like Siri and Alexa will take on 3D forms that can see and be seen. Their eyes will be the embedded billion eyes of the matrix. They will be able not just to hear our voices but also, by watching our virtual avatars, to see our gestures and pick up on our microexpressions and moods. Their spatial forms—faces, limbs—will also increase the nuances of their interactions with us. The mirrorworld will be the badly needed interface where we meet AIs, which otherwise are abstract spirits in the cloud.

DR. BILL CASEBEER:
Research Area Manager in Human Systems and Autonomy for Lockheed Martin’s Advanced Technology Laboratories, where he leads programs to improve human performance and the ability of people and autonomous technology to work together on teams.
Bill served as a Program Manager at DARPA from 2010 to 2014, where he established DARPA’s neuroethics program.

ROBIN HANSON (AGE OF EM):
Associate professor of economics at George Mason University, with a Ph.D. in social science from Caltech. He is known as an expert on idea futures and markets, and he was involved in the creation of the Foresight Institute's Foresight Exchange and DARPA’s FutureMAP project. He has researched artificial intelligence, Bayesian statistics, and hypertext publishing at Lockheed and NASA.
