

More "Mind Machines" reviewed

 by J. Brad Hicks
 Special to MIND-L and CIS:NEWAGE


   While I was out in San Francisco on business, I visited the
original "mind spa," Randall Adamadama's shop, Universe of You.  Randy
now offers only two services: a 45 minute light and sound mind machine
trip (see my previous buyer's guide to light and sound mind machines)
for $10, or the same with the addition of cranial electrical
stimulation (CES) for $15.
   The "light" and some of the "sound" part of the program comes from
a Mind Gear Innervision PR-2 outfitted with InnerQuest IQ-9110 goggles
(which Randy uses because they let the wearer manually adjust
brightness, via a small dial on the right temple). The audio portion
of both types of session is mixed with his own custom blend of new age
music and synthesized "natural" sounds.  The CES is provided by a
Light & Sound Turbocharger slaved to the PR-2.
   Both services come in three flavors, identified only as "low,"
"medium," and "high."  Adamadama is closed-mouthed about the actual
sequence and content of the sessions, but when pressed, describes the
low sessions as oriented towards the upper delta to lower theta range
of frequencies;  high, in the range from upper alpha to high beta; and
medium, an introductory trip covering a wide range of frequencies.
Since I was on my way back to a technical trade show, I opted for the
45 minute "high" session with CES.
   My first discovery about CES:  at even fairly mild voltages, those
earclip-style electrodes HURT, almost exactly like poking your
earlobes with needles over and over again.  (I haven't tried it, but I
hypothesize that this may be less of a problem with headband or behind
the ear electrodes.)  When I complained, Randy moved quickly to turn
down the power level.  According to him, the ideal voltage is just at
the point where you can barely feel it.
   For those of you who don't have the brainwave frequency ranges
memorized, a session that started at high alpha (I'm guessing around
12 Hz) and worked up towards high beta (25 to 30 Hz, maybe) should
have left me fidgety and jittery, and full of ideas.  Indeed, I
usually get just that sensation from running similar (if shorter)
programs on my DAVID Paradise at home.  But instead, I fell asleep
about halfway through the session, and kept right on snoring until the
sudden end of the session woke me up.
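For readers who want the band boundaries spelled out, here is a minimal
Python sketch of the ranges I'm using above.  The exact cutoff
frequencies are my approximations; different authors draw the lines
slightly differently.

```python
# Rough map of the brainwave bands discussed above.  The cutoffs below
# are common approximations, not a standard -- treat the exact boundary
# values as assumptions.
def brainwave_band(hz):
    """Name the brainwave band for a frequency in Hz."""
    if hz < 4:
        return "delta"   # deep sleep
    if hz < 8:
        return "theta"   # drowsiness, deep relaxation
    if hz < 13:
        return "alpha"   # relaxed wakefulness
    return "beta"        # alert, active thought

print(brainwave_band(12))  # high alpha, where my session started
print(brainwave_band(27))  # high beta, where it was headed
```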
   One experience does not a full evaluation of the technical
possibilities of CES make, I realize.  What's more, I begin to wonder
if Mr. Adamadama's choice of accompanying music wasn't a contributing
factor: if I were going to pick music to wake up my mind, it wouldn't
be new age.  Nor am I sure that the simple, unvarying square waves of
the Light & Sound Turbocharger are the best way to influence the
brain. So at my next opportunity, I think I want to test something
more like Bob Beck's Brain Tuner series of CES hardware.


   Back at the trade show the next day, I got a chance to test drive a
brainwave biofeedback device called the Interactive Brainwave Visual
Analyzer, or IBVA, by Psychic Lab, Inc.  It combines a wireless EEG
sensor, worn in a headband, with a receiver connected to a Macintosh
computer.  Software running on the Mac does a
fast Fourier transform on the output and displays a running 3-D graph
of your brainwave activity: the horizontal axis is time (right to
left), the axis pointed towards you is brainwave frequency from 0 to
30 Hz, and the vertical axis is relative energy.  With only one sensor
(the configuration they were showing), you position it at the back of
your head; I would guess that with two you would put one over each ear
and be able to see a separate graph for each hemisphere.
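The analysis side of what that software does (take a window of raw EEG
samples, run an FFT, total up the energy in each band) can be sketched
in a few lines of Python.  To be clear, the sample rate and band edges
below are my assumptions, not Psychic Lab's published specs, and NumPy
stands in for whatever math library they actually use.

```python
import numpy as np

# Assumed band edges in Hz; see the caveat in the text above.
BANDS = {"delta": (0.5, 4), "theta": (4, 8),
         "alpha": (8, 13), "beta": (13, 30)}

def band_energies(samples, sample_rate):
    """Return relative energy per brainwave band for one EEG window."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2           # power spectrum
    freqs = np.fft.rfftfreq(len(samples), 1.0 / sample_rate)
    energies = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)                # bins in this band
        energies[name] = spectrum[mask].sum()
    total = sum(energies.values()) or 1.0
    return {name: e / total for name, e in energies.items()}

# Synthetic check: a pure 10 Hz sine should register as almost all alpha.
rate = 128
t = np.arange(rate) / rate
rel = band_energies(np.sin(2 * np.pi * 10 * t), rate)
```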
   That 3-D graph is damnably hard to read.  In fact, the sheer
intellectual effort involved may have colored my results; please keep
this in mind.  When my turn at the hardware came up, I let them put
the headband on me and adjust it, then started my standard meditation
exercises while watching the screen.  No matter what I tried, the
output stayed the same: VERY high beta, some delta, nothing much else.
Then I remembered that a lot of biofeedback clinics don't let the
subject see the output until after the full set of exercises, so I
stared off into space above the screen and tried again.  When I looked
back, I saw a slight increase in delta energy, but not much other change.
   Now, I've been meditating pretty frequently since 1973, and my
subjective experience does NOT match what I saw on the IBVA screen. I
can make three hypotheses but can't figure out any way to decide
between them without more evidence than I've got.  Either (1) I'm not
one tenth the meditator I think I am, or (2) their hardware and
software was not giving reliable readings, or (3) I was reading the
output wrong, due to the confusingly overloaded, badly labelled graphs.
   I may not have been impressed, but an awful lot of people were.
Psychic Lab's booth had one of the longer lines I saw at the show. On
the other hand, most of the people I saw put on the sensors didn't
look like they were even bothering to try to modify the output;  I
think they were just entranced by the narcissistic effect of seeing their
own brainwaves.
   Their literature suggests a low-end setup with a one-channel IBVA
sensor would run fine on a Mac LC or PowerBook 100 with 2MB of RAM,
and would cost you $995.  To add the second sensor they suggest moving
up to a Mac IIci or PowerBook 170 with 4MB of RAM; the two-channel
system would set you back $1640.  (Obviously, neither of those prices
includes the computer itself; add roughly $2k and $5k respectively if
you need to buy a Macintosh for the purpose.)


   The first system that I know of to market anything called (or
resembling) virtual reality to the public is on a trendy pier on Lake
Michigan in Chicago: BattleTech Center.  BattleTech Center has twelve
custom-built "cockpits" that simulate the interior of BattleTech
walking tanks straight out of the wargame of the same name, based on
the ubiquitous Japanese "mechwarrior" genre of animated cartoons. They
can run two battle simulations at a time; in each of them, two teams
of three mechs battle it out on a planet's surface with randomly
chosen conditions of visibility, light, weather, etc.
   The inside of your cockpit has a large "windshield" screen with
various heads-up displays projected on it; below it is a smaller radar
screen and all around you are the controls to the mech.  Unlike other
VRs, the BattleTech system doesn't even try to provide stereo optics.
Arguably, at the ranges and speeds involved, human stereo vision isn't
very reliable, anyway, so it shouldn't matter.  Maybe it doesn't,
after all.  But after spending about a half-hour in the cockpit in two
different game sessions, I can tell you that having only a forward
view gets very annoying;  I kept wanting to look from side to side.
   I would say that BattleTech makes a good video game, but it's not a
very convincing "artificial reality" simulation in that neither I nor
any of the players I spoke with had much of a sensation of actually
being there.  And BattleTech Center is expensive, too; I don't have my
price list handy anymore, but I remember it being somewhere around $10
to $20 per 15 minute game.

   Months later I got to see a demo of the new MicroCosm system from
virtual reality pioneers Jaron Lanier and VPL.  The MicroCosm is the
first system to offer stereo goggles and 3-D audio on a microcomputer
-- if you can call a Macintosh Quadra 900 with 8 MB RAM and a 160 MB
hard disk a micro.  The MicroCosm uses the joint processing power of
the 68040 in the Quadra, another CPU on a Nubus adapter card, and a
large scale array processor in the upright MicroCosm case to render
3-D video and audio on the fly while tracking the position sensors in
the goggles and in the powerglove.  The whole assembly (minus the
high-end Quadra 900) costs about $50,000.
   The VPL MicroCosm comes with nice software for building virtual
realities.  For creating objects, it uses the highly rated Swivel 3D
Professional software that is already quite popular as a 3D drawing
and rendering package on the Macintosh.  It offers good, easy to use
3D drawing tools and simplifies the design of moving, articulated
assemblies; putting jointed fingers on a hand or meshing teeth on
gears is easy in Swivel.
   Once objects are drawn in Swivel, you import them into another
piece of software that lets you specify their movements and how they
interact with each other or with the wearer. (One demo showed a
floating paintbrush that the wearer could use to "paint" patterns on the
side of a large floating cube.)
   The interface for the world building software would seem very
familiar to an electrical engineer: reality "components" are shown as
blocks with various "pins" for their inputs and outputs; you drop them
onto the diagram and then draw lines to connect inputs to outputs.
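As a toy illustration of that blocks-and-pins metaphor, wiring one
component's output to another's input might look like the sketch below.
Every name here is invented for illustration; this is not VPL's actual
component model or API.

```python
# Hypothetical model of the "blocks with pins" world-building diagram:
# each reality component exposes named input and output pins, and the
# builder draws wires from outputs to inputs.
class Component:
    def __init__(self, name, inputs, outputs):
        self.name = name
        self.inputs = {pin: None for pin in inputs}  # pin -> (source, out_pin)
        self.outputs = list(outputs)

def connect(src, out_pin, dst, in_pin):
    """Draw a 'wire' from src's output pin to dst's input pin."""
    assert out_pin in src.outputs and in_pin in dst.inputs
    dst.inputs[in_pin] = (src, out_pin)

# Example wiring, loosely modeled on the paintbrush demo: the glove's
# position pin drives the floating paintbrush's position pin.
glove = Component("glove", inputs=[], outputs=["position"])
brush = Component("paintbrush", inputs=["position"], outputs=["stroke"])
connect(glove, "position", brush, "position")
```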
   The 3-D wraparound goggles make for a much more "real" experience
than the BattleTech cockpits, but the system still falls very short of
convincing, because as soon as the universe gets at all complicated or
anything starts moving, the refresh rate drops noticeably, and screen
updates get VERY jerky.  Jaron Lanier claims that their $250,000
professional system handles such situations much better.  And it,
unlike the MicroCosm, allows two people to enter the same reality and
interact with each other.

   A couple of months ago, a dance club in St. Louis called Atomix
changed its format to industrial music and its name to RAIL.  RAIL
made the papers lately when they installed their own public virtual
reality system, called Dactyl Nightmare from Virtuality.  A friend of
mine and I, who were already planning on going out dancing, decided to
drop by RAIL, and while there try out Dactyl Nightmare.  According to
the papers, the Virtuality hardware involved costs $60,000 per station
and RAIL has two of them connected together sharing the same reality;
they rent time on them for $4 per person per 4 minutes.
   Like BattleTech Center before it, Dactyl Nightmare is at heart a
violent video game.  Once they put on the goggles and headphones and
pick up the controller stick, the two players are transported onto a
multi-layer platform floating in space (with stars drifting in the
background, a nice touch).
   Hold your arm out in front of you, and you see a crudely drawn arm
sticking out, and the simple control stick has metamorphosed into
something resembling a pistol grenade launcher. The trigger fires a
short-range grenade (at low velocity, in big arcs, up to every 3
seconds); the button on top under your thumb moves you forward.  As
you turn your body or just your head, the view shifts, of course.  If
you crouch down, your viewpoint drops -- and the other player sees you
crouched down, with your head tilted or turned appropriately.  When
the other player is moving, their game image looks like it's running.
   Flying in circles overhead is a big green pterodactyl; let it get
too close and it will grab you, carry you up way over the playing
field, and drop you to your death.  The object of the game is to
repeatedly hunt each other down, while fending off the 'dactyl.  Each
time you get killed, you rematerialize elsewhere on the platform.
   Maybe it has something to do with being hunted, and maybe it has a
lot to do with the hallucinatory quality of the experience, but those
four minutes seem like a VERY long time.  In fact, while I never
really lost the feeling of playing a video game, I got a lot of the
same adrenaline rush I get from playing paintball; when I was shot at
(or picked up by the 'dactyl) I kept expecting it to hurt.
   Pam was much more strongly affected.  She asked me when we got out
if those machines used strong electrical fields to track your position
(they don't), because as soon as she stuck her arm out, it started to
tingle and then got numb, and by the end of the four minutes her whole
body was tingling and starting to go numb.  We talked it out for a
long time, and I think I know what happened: she put her hand out and
saw something that didn't look at all like her hand; looked down and
saw something that didn't look at all like her body; held out a stick
and saw a huge gun.  The dissociation freaked her out and was moving
towards completely paralyzing her.
   At one point early on, she drifted to the edge of a platform and
looked down at the platform below trying to find the way down -- and
then started screaming because she felt exactly like she was looking
down off of a cliff and she couldn't remember how to back away from
the edge; she thought she was going to fall.  (You can't back up; you
have to turn your body in the desired direction and then go forward.)
It was definitely real enough to mess with her mind.
   But we both had complaints about some things that were entirely
unrealistic.  When you run, the viewpoint is rock solid stable,
entirely unlike what real humans see when they walk or run.  Even
worse was going down or up the stairs, where the movement was so
smooth it was more like flying than running.  Unlike the MicroCosm,
Dactyl Nightmare doesn't even attempt 3-D sound; when the 'dactyl
screams or another player shoots at you, your ears give you no clue
which way to turn.  And if you turn your head at all quickly, the
display becomes very jerky.  (To its credit, as hinted above, it
does give very good depth vision.)

   Having "test driven" BattleTech Center, the MicroCosm, and Dactyl
Nightmare, I think that contrary to most of the hype you've heard so
far, virtual reality has a very, very long way to go.  Unless you can
afford as much CPU power as a multimillion dollar aircraft simulator,
the sensory inputs just can't keep up well enough to fool the mind.
But based on Pam's experience with Dactyl Nightmare, when it does, it
will prove a very disorienting experience indeed.
