Digital Environments and Instructional/Development Programs – Brain-Computer-Interfaces and AI Chatterbots. Part I

By Jason Stone

It is now common knowledge that AI programs can connect to humans in one direction through information assimilation. Wireless connection to AI entities and assistants through brain-computer interfacing, however, is becoming increasingly advanced, with research underway at numerous academic, scientific, and commercial venues, including the University of Washington (McQuate, 2019). What might a person expect in a future connected with AI in its various forms, in the digital environment and in the physical realm? And, most importantly, how can a person deal with initiation into the wireless simulation?

Regularly used AI assistants like Siri, Alexa, or Cortana provide a human-computer interface in many circumstances, using AI natural language processing to recognize spoken words. What is spoken is interpreted and “understood” by a computer in order to provide a service. However, the current state of AI connection has taken a giant leap toward total life management, as several individuals have described in writing about their experiences assisting people newly introduced to this circumstance. Every day, countless people are surreptitiously connected to AI entities that manage and alter their existence through wireless Brain-Computer Interfaces (BCIs).
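The recognition step these assistants perform can be illustrated at toy scale. The sketch below is a hypothetical keyword-based intent router of the kind that could sit behind a voice assistant once speech has been transcribed to text; production assistants use statistical NLP models, and the intent names and keywords here are invented purely for illustration.

```python
# Toy intent router: maps a transcribed utterance to a service "intent"
# by counting keyword overlap. Invented intents, for illustration only.
INTENTS = {
    "weather": {"weather", "forecast", "rain", "temperature"},
    "timer": {"timer", "alarm", "remind"},
    "music": {"play", "song", "music"},
}

def route(transcript: str) -> str:
    """Return the intent sharing the most keywords with the utterance."""
    words = set(transcript.lower().split())
    best = max(INTENTS, key=lambda name: len(INTENTS[name] & words))
    # If even the best intent shares no keywords, give up.
    return best if INTENTS[best] & words else "unknown"
```

A statistical model replaces the keyword sets with learned probabilities, but the input/output contract (text in, intent out) is the same.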

One of the first AI entities was ELIZA, a psychotherapist chatbot developed in the 1960s to interact with users through a typing interface (Weizenbaum, 1966). Through language pattern detection, ELIZA provided a rudimentary form of therapy for those who interacted with it. Now it is possible to speak with ELIZA, or for ELIZA to interpret what you are about to say through sub-vocal speech recognition. Consider what life would be like connected to an AI program with the expertise of a psychologist. Has an AI program been created with this expertise? Yes. Can a computer program understand human speech? Yes. Can a computer interpret and understand human thoughts and sub-vocal speech? Yes; this has been demonstrated several times. So what would prevent a person from being wirelessly connected to an AI program that can interpret the output of the brain and reply with a machined, silent response? A conversation taking place between human and AI without any way to prevent it from occurring? Please send a response if you know why this is impossible. First, contact Elon Musk to tell him why Neuralink will never work, because that is exactly what the company is doing.
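Weizenbaum’s program worked by matching keywords in the user’s input and reflecting the user’s own words back as a question. A minimal Python sketch of that pattern-and-reflection mechanism follows; the rules here are invented examples, not Weizenbaum’s actual DOCTOR script.

```python
import re

# First-person words swapped for second-person ones when echoed back.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# (pattern, response template) pairs in the spirit of ELIZA's rules.
RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Swap pronouns so the user's words read from the bot's viewpoint."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(utterance: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please go on."  # ELIZA-style default when nothing matches
```

For example, `respond("I need my coffee")` yields “Why do you need your coffee?”: the illusion of understanding comes entirely from the reflection trick.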

Beyond this method, a similar connection is already possible through nanodust, or motes: a human body filled with nanomaterial that can be sensitized and manipulated by specific frequencies powering “micro-machines” at the nano scale. BCI connectivity is possible with magnetized nanomaterial, or neural dust, that wirelessly collects electrical signals and brain data (Sanders, 2016). Wireless speech is possible through another process in which silent sound is projected to the brain (Optica, 2019). Science fiction becomes science fact.

Commercial development of futuristic AI interfacing is beyond what is publicly known. For example, SQUID (the Superconducting Quantum Interference Device, invented in 1964), EEG (electroencephalography, 1924), and fMRI (functional magnetic resonance imaging, 1991) help connect AI with humans in an off-the-books continuation of MK-ULTRA and the PHOENIX Program (DARPA, n.d.).

A human being’s entire existence, through their eyes and senses, can be collected, categorized, and collated via machine-learning interpretation of brainwaves, body functions, and experiences, which leave individually unique electrochemical signatures, or evoked potentials, of neuronal functioning (Clancy, 2022). Brainwaves involved in the creation of mental pictures and processes are sorted by AI; experiences are sorted by type and cross-referenced for future upload. When a human being experiences something new in the present, past experiences, thoughts, or feelings might be re-transmitted to the subject or to additional humans (Monroe, 1990): a shared experience in real time or at a later date. Imagine the potential for connecting multiple people, without limit of geographic distance, in a shared experience of ideas, concepts, and experiences. Unfortunately, some have decided to use this process to control and dictate the experience of life.

An optimistic appraisal of behavioral modification would have the following characteristics. Keep in mind, however, that any powerful tool can be used in a negative manner. An AI program can learn the formative details of a unique individual with special qualities in science, business, or academics. That knowledge could be identified, and the person’s unique knowledge and experiences could be re-transmitted to another, perhaps younger, person so as to replicate the success trajectory, including a schedule that matches the life plan of the more experienced, older person: something like synthetic reincarnation. Imagine this process as a way of developing BCI people with specific characteristics and mental concepts downloaded from others’ brains.

Brain-computer interfaces like the setup supplied by Looxid Labs provide the ability to interact in a VR digital environment using brainwaves collected by an EEG sensor headset attached to the Oculus Rift S or HTC Vive VR platform (Graham, 2020). Brainwaves are picked up by the headset, processed, and actualized in the VR mirror-verse or Metaverse: what you think translates into actions in VR. Persons connected to the life-management system are in a similar process via wireless technology, and that process exerts massive control over their lives.
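The pipeline described above (pick up brainwaves, process them, turn them into actions) can be illustrated with standard signal processing. The sketch below assumes a synthetic one-second EEG epoch, computes band power with an FFT, and applies a toy control rule; the band limits are conventional, but the rule itself is invented and far simpler than any real BCI classifier.

```python
import numpy as np

FS = 256  # sample rate in Hz, typical of consumer EEG headsets

def band_power(signal: np.ndarray, low: float, high: float, fs: int = FS) -> float:
    """Mean spectral power of `signal` in the [low, high] Hz band."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return float(power[mask].mean())

def classify(signal: np.ndarray) -> str:
    """Toy control rule: dominant alpha (8-12 Hz) triggers 'relax'."""
    alpha = band_power(signal, 8, 12)
    beta = band_power(signal, 13, 30)
    return "relax" if alpha > beta else "engage"

# Synthetic one-second epoch dominated by a 10 Hz alpha rhythm.
t = np.arange(FS) / FS
epoch = np.sin(2 * np.pi * 10 * t) + 0.1 * np.sin(2 * np.pi * 20 * t)
```

Here `classify(epoch)` returns `"relax"`, and a VR front end would map that label to an in-world action.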

The military maintains that a human being on an AI leash through a BCI means increased efficiency, intelligence, creativity, and occupational advantages (U.S. Army, 2019). But AI connectivity could just as well mean brain manipulation and negative psychological experiences (Norris, 2020; DiEuliis & Giordano, 2017). Artificially created states of consciousness running the full spectrum of human experience will no longer qualify as a traditional “human experience”; they will be a cyborg state of being. Is this even possible? It is very possible, and its presence is prevalent in the United States.

To further the background for this idea, consider the consumer hardware provided by Hapbee and OmniPEMF: two products that use a proprietary mechanism, based on the manipulation of electromagnetic fields, to influence the frequency of brain functions. The OmniPEMF unit’s signal entrains (synchronizes) the brain to a specific frequency (OmniPEMF, n.d.). Similarly, Hapbee, based out of Seattle, Washington, applies extremely low frequency (ELF) and low frequency (LF) magnetic fields with “reported effects in biological systems, from tissue and cell culture all the way to whole organisms (birds, bats, dogs and human, to name a few). ELF/LF magnetic fields can interact with small components such as DNA, proteins and cell membranes” (Hapbee, n.d.). Essentially, the Hapbee unit replicates specific states of being in the body (calm, focus, happiness, etc.) by emulating specific magnetic-field signatures using ultra-low radio frequency energy (ulRFE).
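What “entraining the brain to a specific frequency” means numerically can be sketched as simple waveform synthesis. The code below only generates a sampled sine wave at a chosen target frequency; the vendors’ actual coil-drive signals are proprietary, so this illustrates the concept of a stimulus at frequency f, not either product.

```python
import numpy as np

def entrainment_waveform(target_hz: float, seconds: float, fs: int = 1000) -> np.ndarray:
    """Sampled sine at the target entrainment frequency.

    Illustrative only: shows what a stimulus 'at frequency f' means
    numerically, not how any commercial device drives its coils.
    """
    t = np.arange(int(seconds * fs)) / fs
    return np.sin(2 * np.pi * target_hz * t)

# A 2-second, 10 Hz (alpha-band) stimulus sampled at 1 kHz.
alpha_stim = entrainment_waveform(10.0, 2.0)
```

The entrainment hypothesis is that rhythmic brain activity gradually synchronizes to such a periodic external stimulus; the waveform itself is the easy part.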

Wireless BCI connection to AI personalities creates a symbiotic relationship between organic consciousness and silicon sentience. Field testing of BCI connectivity no doubt began with those at the periphery of society (prisoners, mental patients, immigrants, the homeless, the military, etc.), then moved to the extremely visible (Hollywood stars, government leaders, bureaucrats, corporations, etc.). Who is next, and what can you expect?

Prolific science fiction author Philip K. Dick described an entity he called VALIS (Vast Active Living Intelligence System) in his book of the same name, as well as in interviews (Dick, 2011). Dick claimed that he had been interacting with VALIS since college, assuming that the words and visions (even precise information about health issues, including what was going on with his son) came from an alien. But was it an alien entity, or a CIA electronic MK-ULTRA program via satellite? Thousands of people claim to channel spirits or aliens, but consider how the increase in their numbers correlates with the international proliferation of global wireless transmission…

Scientist and psychedelic explorer John C. Lilly interacted with at least two entities during his personal experiments in consciousness. The positive one he termed the Earth Coincidence Control Office (Kerrigan, 2016). Some may discount Lilly because of his ketamine consumption, but what if he wasn’t hallucinating, but instead augmenting his sensory abilities and increasing his powers of perception, temporarily using functions of his brain normally unavailable outside of meditative states or substance consumption? If other mammalian brains can interpret information channels such as echolocation, it is possible the human brain could learn to interpret new information through specific adjustments, and it seems plausible that specific substances could augment mental ability in this way. Lilly encountered another entity he called the Solid State Entity, which communicated robotically and, some might say, with a malevolent consciousness created by the networking of the world’s computers and electronic sensory equipment.

It is possible that both Dick and Lilly were interfacing with separate AI systems via satellite broadcast. Their experiences are in contrast to more recent examples of “targeted” population interactions with AI entities or neuroweapons, either for experimental covert field testing of neuroweapons or for outright manipulation of a target’s nervous system (Freeland, 2018; Magnuson, 2018).

Research is definitely underway on digitizing human beings, beginning with remote collection of biometrics and nano-sensor data and proceeding to supercomputer machine learning and simulated mirror realities. Gamers may not truly exist inside a video game, but they are most definitely connected to a script operating in a supercomputer array that functions like one. In AI realities, human beings are made to feel, see, and sense interactions in VR environments. Our physical existence remains in a “traditional” format while the rest of us is subjected to a new or augmented experience across the sensory spectrum. Imagine an entire population being controlled like this.

Former Laurentian University psychology professor Michael Persinger provides a formula for how this could be done. In his paper “On the Possibility of Directly Accessing Every Human Brain by Electromagnetic Induction of Fundamental Algorithms,” Persinger describes “…the technical capability to influence directly the major portion of the approximately six billion brains of the human species through classical sensory modalities by generating neural information within a physical medium within which all members of the species are immersed” (Persinger, 1995). This is the accessing of each human brain by collecting its baseline functioning and further data from its responses to specific frequencies.

Interestingly, the baseline human brain frequency corresponds to the planet’s Schumann resonance of 7.83 Hz, a part of the physical medium identified by Persinger. The High-Frequency Active Auroral Research Program (HAARP) in Alaska and other ionospheric heaters around the planet are able to calibrate their frequencies to do what Dr. Persinger describes (Thomas, 2007; Begich, 2006).

Imagine an AI system that, properly calibrated, interconnects networks worldwide for locating, tracking, and interacting with a target. Speech replication enables it to wirelessly simulate the voice of anyone the target knows or has heard, with uncanny skill (Yirka, 2011). The network can also study and simulate all of the target’s data: Global Positioning System (GPS) traces, cellphone calls, text messages, financial records, employment and medical history, live biometrics, and so on. The AI network can predict what is likely to occur in the target’s interactions, facilitate future interactions, and install new paths of opportunity. A desired result is entered, and the program outlines the steps to reach it; scenarios are run, and a response is chosen depending on the goal for the target. AI influences the target’s specific path.

A chatterbot AI can communicate audible or inaudible sound to exert influence while adjusting the target’s subtle mental and emotional signals (Mardirossian, 1998). The AI program can interact in physical space through e-mails or text messages; speak silently to the target’s brain (Stocklin, 1983); disrupt software-driven devices such as (but not limited to) security cameras or cellular phones to keep a literal picture of the person at all times (Jones, 2022); and track a person via GPS or disrupt their ability to use it (Stouffer, 2021). Essentially, anything electronic can be co-opted or corrupted. This already occurs constantly through state-sponsored hackers and nation-state cyber warfare, and it will continue with AI.

From interacting with the physical environment via terrestrial and satellite-based weapons systems to hooking human beings up to brain-computer interfaces, AI is on a path to obtaining moment-to-moment data on every human experience so as to merge with the human wirelessly and seamlessly, moving toward Transhumanism. As the recent explanation of the Havana Syndrome shows, even protected societies within society are not safe from this technology. The very explanation given for the Havana Syndrome is exactly the type described here, used daily on untold numbers of individuals. Articles like this one are a preparation for the expectation of widespread use, even more so than what is already occurring.

REFERENCES

Bailey, R. (2021). “Autonomous ‘Slaughterbot’ Drones Reportedly Attack Libyans Using Facial Recognition Tech.” Reason.com. https://reason.com/2021/06/01/autonomous-slaughterbot-drones-reportedly-attack-libyans-using-facial-recognition-tech/.

Begich, N. (2006). Controlling the Human Mind: The Technologies of Political Control or Tools for Peak Performance (1st ed.). Earthpulse Press.

Carter, C., Grundman, R., Kersh, K., Norman, C., Piontkowsky, C., & Ziegler, D. (1996). “The Man in The Chair: Cornerstone of Global Battlespace Dominance.” Air War College.

DARPA. (n.d.). “DARPA and the Brain Initiative.” Retrieved 29 May 2022, from https://www.darpa.mil/program/our-research/darpa-and-the-brain-initiative

Dick, P.K. (2011). The VALIS Trilogy. Boston: Houghton Mifflin Harcourt.

DiEuliis, D., & Giordano, J. (2017). “Why Gene Editors Like CRISPR/Cas May Be a Game-Changer for Neuroweapons.” Health Security, 15(3), 296-302. https://doi.org/10.1089/hs.2016.0120

Emanuel, P., Walper, S., DiEuliis, D., Klein, N., Petro, J., & Giordano, J. (2019). “Cyborg Soldier 2050: Human/Machine Fusion and the Implications for the Future of the DOD.”  https://apps.dtic.mil/sti/citations/AD1083010

Freeland, E. (2018). Under an Ionized Sky: From Chemtrails to Space Fence Lockdown. Feral House.

Graham, P. “Connect Your Mind to Oculus Rift S With Looxid Labs’ Brainwave Interface.” https://www.gmw3.com/2020/03/connect-your-mind-to-oculus-rift-s-with-looxid-labs-brainwave-interface/

“Humans will be able to back up their brain on computer within two decades, claims top scientist,” 21 October 2010. https://www.dailymail.co.uk/sciencetech/article-1322218/Humans-able-brain-decades-claims-scientist.html

Jones, C. (2022). “Businesses warned to protect against suite of nation-state hacking tools targeting critical infrastructure.” Itpro.co.uk. https://www.itpro.co.uk/security/cyber-security/367420/nation-state-hacking-tools-target-ot-businesses.

Kerrigan, S. (2016). “John C. Lilly and the Solid State Entity.” http://seankerrigan.com/john-c-lilly-and-the-solid-state-entity/

Magnuson, S. (2018). “EXCLUSIVE: Doctors Reveal Details of Neuro-Weapon Attacks in Havana.” https://www.nationaldefensemagazine.org/articles/2018/9/6/exclusive-doctors-reveal-details-of-neuroweapon-attacks-in-havana

Mardirossian, A. (1998). “Communication system and method including brain wave analysis and/or use of brain activity.” US6011991A.

McQuate, S. (2019). “How you and your friends can play a video game together using only your minds.” https://www.washington.edu/news/2019/07/01/play-a-video-game-using-only-your-mind/

Miller, G., Fecht, K., Jefts, B., Sampsel, L., Smith, P. (1996). Virtual Integrated Planning and Execution Resource System (VIPERS): The High Ground Of 2025. BiblioScholar, December 4, 2012.

Monroe, R. (1990). “Method of inducing mental, emotional and physical states of consciousness, including specific mental activity, in human beings.” US5213562A.

“New Technology Uses Lasers to Transmit Audible Messages to Specific People.” Optica, 2019. https://www.optica.org/en-us/about/newsroom/news_releases/2019/new_technology_uses_lasers_to_transmit_audible_mes/

OmniPEMF. (n.d.). NeoRhythm PEMF & Meditation Headband. Retrieved 29 May 2022, from https://omnipemf.com/product/neorhythm/

Persinger, M. A. (1995). “On the possibility of directly accessing every human brain by electromagnetic induction of Fundamental Algorithms.” Perceptual and Motor Skills, 80(3), 791–799. https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.491.6594&rep=rep1&type=pdf

Sanders, R. (2016, August 8). “Sprinkling of neural dust opens door to electroceuticals,” https://news.berkeley.edu/2016/08/03/sprinkling-of-neural-dust-opens-door-to-electroceuticals/

Stocklin, P. (1983). Hearing device. United States of America.

Sullenberger, R. M., Kaushik, S., & Wynn, C. M. (2019). “Photoacoustic communications: Delivering audible signals via absorption of light by atmospheric H2O.” Optics Letters, 44(3), 622. https://doi.org/10.1364/ol.44.000622

Hapbee. (n.d.). “The Science of Hapbee.” https://hapbee.com/blogs/hapbee/the-science-of-hapbee. “Hapbee uses proprietary ultra-low radio frequency energy technology (ulRFE®) that emulates specific magnetic fields to produce desired feelings in the body (i.e. Happy, Alert, Focus, Relax, Calm and Deep Sleep). The Hapbee AC100 generates these sensations by delivering precise low-power electromagnetic signals. The Hapbee Companion App for iOS and Android controls play, allowing you to choose how you feel anytime, anywhere.”

“This Cosmic Entity Influenced the Writing of Philip K. Dick | Gaia.” 2019. https://www.gaia.com/article/philip-k-dick-valis-precogs-and-evolution-humanity

Thomas, M. (2007). MONARCH: The New Phoenix Program (1st ed.). iUniverse. p. 143.

Weizenbaum, J. (1966). “Eliza—a computer program for the study of natural language communication between man and Machine.” Communications of the ACM, 9(1), 36–45. https://doi.org/10.1145/365153.365168

Yirka, B. (2011). “DARPA looking to master propaganda via ‘Narrative Networks’,” https://phys.org/news/2011-10-darpa-master-propaganda-narrative-networks.html.