Showing posts with label hijacking minds.

Wednesday, September 9, 2015

Alchemy, Magic, Mentalism, Mind-Reading, Hypnosis - SUMMARIZED

YouTube

by Mark Maldonado

Friday, December 14, 2012

Coming Soon From the Air Force: Mind-Reading Drones

Wired/Danger Room
BY SPENCER ACKERMAN

Scientifically speaking, it’s only a matter of time before drones become self-aware and kill us all. Now the Air Force is hastening that day of reckoning.

Buried within a seemingly innocuous list of recent Air Force contract awards to small businesses are details of plans for robot planes that not only think, but anticipate the moves of human pilots. And you thought it was just the Navy that was bringing us to the brink of the drone apocalypse.

It all starts with a solution for a legitimate problem. It’s dangerous to fly and land drones at busy terminals. Manned airplanes can collide with drones, which may not be able to make quick course adjustments based on information from air traffic control as swiftly as a human pilot can. And getting air traffic control involved in the drones cuts against the desire for truly autonomous aircraft. What to do?

The answer: Design an algorithm that reads people’s minds. Or the next best thing — anticipates a pilot’s reaction to a drone flying too close.

Enter Soar Technology, a Michigan company that proposes to create something it calls “Explanation, Schemas, and Prediction for Recognition of Intent in the Terminal Area of Operations,” or ESPRIT. It’ll create a “Schema Engine” that uses “memory management, pattern matching, and goal-based reasoning” to infer the intentions of nearby aircraft.
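Soar hasn't published ESPRIT's internals, but "pattern matching and goal-based reasoning" over maneuvers is a well-known recipe. A minimal sketch, with entirely invented schemas and maneuver names (nothing below comes from the actual contract), might match an observed maneuver sequence against a library of goal "schemas":

```python
# Toy sketch of schema-based intent recognition.
# The schemas and maneuver names are invented for illustration;
# ESPRIT's real design is not public.

SCHEMAS = {
    "landing":   ["descend", "slow", "align_runway"],
    "go_around": ["descend", "slow", "climb"],
}

def infer_intent(observed):
    """Return every schema whose prefix matches the maneuvers seen so far."""
    return [name for name, pattern in SCHEMAS.items()
            if pattern[:len(observed)] == observed]

print(infer_intent(["descend", "slow"]))           # still ambiguous: both match
print(infer_intent(["descend", "slow", "climb"]))  # resolved: go_around
```

The point of the prefix match is that intent narrows over time: early in an approach several goals are consistent with the evidence, and the drone would hedge until the candidate set collapses to one.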

Not presuming that every flight will go according to plan, the Schema Engine’s “cognitive explanation mechanism” will help the drone figure out if a pilot is flying erratically or out of control. The Air Force signed a contract Dec. 23 with Soar, whose representatives were not reachable for comment.

And Soar’s not the only one. California-based Stottler Henke Associates argues that one algorithm won’t get the job done. Its rival proposal, the Intelligent Pilot Intent Analysis System, would “represent and execute expert pilot-reasoning processes to infer other pilots’ intents in the same way human pilots currently do.” The firm doesn’t say how its system will work, and it has yet to return an inquiry seeking explanation. A different company, Barron Associates, wants to use sensors as well as algorithms to avoid collisions.

And Stottler Henke is explicitly thinking about how to weaponize its mind-reading program. “Many of the pilot-intent-analysis techniques described are also applicable for determining illegal intent and are therefore directly applicable to finding terrorists and smugglers,” it told the Air Force. Boom: deal inked on Jan. 7.

Someone’s got to say it. Predicting a pilot’s intent might prevent collisions. But it can also neutralize a human counterattack. Or it can allow the drones’ armed cousins to mimic Israel in the Six Day War and blow up the manned aircraft on the tarmac. Coincidentally, according to the retcon in Terminator: The Sarah Connor Chronicles, April 19, 2011 — today — is the day that Skynet goes online. Think about it.

The Air Force theorist Col. John Boyd created the concept of the “OODA Loop” (Observation, Orientation, Decision, and Action) to guide pilots’ operations. He could never have imagined that one of his loops would be designed into the artificial brain of an airborne robot.
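Boyd's OODA loop is, at bottom, a control cycle, which is exactly why it maps so readily onto software. A minimal sketch of one pass through the loop (the threshold, field names, and actions are all hypothetical, purely to show the cycle's shape):

```python
# One pass through Observe -> Orient -> Decide -> Act, as a toy function.
# All names and the 100-unit threshold are illustrative, not from any
# real autopilot or Air Force system.

def ooda_step(observation):
    """Take a raw observation, orient on it, and decide an action."""
    # Orient: interpret the raw sensor data into a situational picture.
    threat_close = observation["distance"] < 100
    # Decide: choose a response based on the oriented picture.
    decision = "evade" if threat_close else "hold_course"
    # Act: the action would feed back into the next observation cycle.
    return decision

print(ooda_step({"distance": 50}))   # a nearby aircraft -> evade
print(ooda_step({"distance": 500}))  # plenty of separation -> hold_course
```

In a running system this function would sit inside a loop, each action changing the world that the next observation samples; whoever cycles through the loop faster, Boyd argued, gets inside the opponent's decision process.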

Monday, December 3, 2012

Mind Management: Researchers Explore New Ways to Influence Minds

Positive Futurist
by Dick Pelletier

The Pentagon’s Defense Advanced Research Projects Agency (DARPA) wants to understand the science behind what makes people violent, and then find ways to hijack their minds by implanting false but believable stories in their brains, in hopes of evoking peaceful thoughts: we’re friends, not enemies.

Critics say this raises ethical issues like those dramatized in the 1971 sci-fi film A Clockwork Orange, in which authorities try to recondition a violent man so that he no longer wants to kill.

Advocates, however, believe that placing plausible new narratives directly into the minds of radicals, insurgents, and terrorists could transform enemies into kinder, gentler citizens craving friendship.

Scientists have known for some time that narratives, accounts of sequences of events usually told in chronological order, hold powerful sway over the human mind, shaping a person’s notions of groups and identities and even inspiring violence.

In another area of mind management, some believe we should focus on genetic components. Scientists at the University of Buffalo recently surveyed DNA from 711 subjects and discovered what they refer to as a ‘niceness gene’: a gene that influences whether people tend to be nice or are prone to antisocial behavior.

Contrary to popular belief, being kind to others may not be something we learn only from those who raised us. It seems some people are simply born ‘nice’, and others nasty.

Researchers found that people who see the world as a ‘threatening’ place were less likely to help others – unless they had versions of the receptor genes that are generally associated with niceness.

Today, scientists have yet to master the ability to change this genetic programming, but by the 2030s, many predict that modifying these genes (with patient approval, of course) will become routine.

Others say mind management with drugs offers the best solutions. This science could reform criminals more efficiently than a jail sentence. However, many ask how ethical it is to interfere with people’s minds.

In their recent ground-breaking book, Enhancing Human Capacities, editors Julian Savulescu, Ruud ter Meulen, and Guy Kahane explore how society could benefit from using technology to alter moods, boost memory, and increase intelligence, along with the ethical concerns these technologies raise.

Kahane says scientists are discovering new behavior-altering procedures that make us more likeable, sociable, and open to other people’s views, and that will curb many of our desires for vengeance and violence.

Drugs that affect our moral thinking and behavior already exist, but we tend not to think of them in that way. Prozac lowers aggression and bitterness, making people more agreeable. Oxytocin increases feelings of social bonding and empathy while reducing anxiety.

Some question, though, whether people will want a pill that makes them morally better. Being more trusting, nicer, and less aggressive could make people more vulnerable to exploitation.

However, proponents believe the benefits are too important to ignore. Pursuing all of the technologies mentioned in this article holds great promise to curb crime and violence worldwide, improve personal and career relationships, and raise happiness levels everywhere.

In another area of the behavior-altering arena, memory-management drugs are about to take center stage. Experts at Memory Pharmaceuticals, a New Jersey drug firm, believe researchers will soon develop drugs that dim or permanently erase traumatic memories.

An even more radical technology, downloading knowledge directly into our brains, will be possible in the 2030s, says Georgia Tech graduate student Peter Passaro. Mind-machine interfaces will let us receive data in the brain and immediately convert it to memory, bypassing the need to learn the information.

Clearly, the road to mind management science winds around unknown turns, but this forward-thinker believes the overwhelming benefits of reducing violence and criminal acts will push this bold idea forward as we move further into what promises to become an incredible 21st century future.