Our lab includes two of the most advanced bicycling/pedestrian simulators in the world, which were completed in 2014 and 2016 with funding from two NSF Computing Research Infrastructure grants.
Each virtual environment can be configured as either a bicycling or a pedestrian simulator. Each simulator consists of three screens placed at right angles to one another. The front screen is 10 ft wide x 8 ft high; the two side screens are each 14.22 ft wide x 8 ft high. Three DPI MVision 400 Cine 3D projectors rear-project high-resolution, textured graphics in stereo onto the front and side screens (1920 x 1080 pixels on each screen), and an identical projector front-projects high-resolution stereo images onto the floor. The simulators include OptiTrack Flex 13 passive optical tracking systems, which provide high-precision rigid-body and full-skeletal motion capture at 120 frames per second. Participants wear active stereoscopic glasses that are synchronized with the projectors via radio frequency to create a fully interactive, immersive, stereoscopic virtual environment.
When configured as a bicycling simulator, an instrumented bike mounted on a stationary rig is placed 5 ft from the front and side screens. The bike is instrumented to record steering and speed, which are combined with virtual terrain information to render, in real time, the graphics corresponding to the rider's movement through the virtual environment. The motion-tracking cameras track the rider's head position in real time, allowing us to render the correct visual perspective for the rider.
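The core of this update loop can be illustrated with a simple kinematic integration step. The sketch below is not the lab's actual rendering code; the function name, state variables, wheelbase, and frame rate are all assumptions chosen for illustration, and the simulator's terrain-aware model is more sophisticated.

```python
import math

def update_rider_pose(x, y, heading, speed, steering_angle, dt, wheelbase=1.0):
    """Advance the rider's virtual position by one frame using a basic
    bicycle kinematic model (illustrative sketch only).

    x, y           -- position in the virtual environment (m)
    heading        -- direction of travel (radians)
    speed          -- measured from the instrumented bike (m/s)
    steering_angle -- measured handlebar angle (radians)
    dt             -- frame interval, e.g. 1/60 s
    """
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    # Heading changes in proportion to speed and steering input.
    heading += (speed / wheelbase) * math.tan(steering_angle) * dt
    return x, y, heading

# Example: riding straight ahead at 4 m/s for one 60 Hz frame
x, y, heading = update_rider_pose(0.0, 0.0, 0.0, 4.0, 0.0, 1 / 60)
```

Each frame, the new pose would be combined with the tracked head position to select the camera viewpoint for rendering.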
When configured as a pedestrian simulator, the motion-tracking cameras track the pedestrian's head position in real time by triangulating the positions of six IR head-tracking markers worn by the participant. This location information is fed to Motive: Body, a software program designed to work in tandem with OptiTrack hardware that can construct full-body and custom skeletons for position tracking. The pedestrian's head position is used to calculate and update, in real time, the geometry displayed on the four projection surfaces (the three screens and the floor) as the participant moves around, giving the illusion that the space extends into a virtual world beyond the walls.
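Head-coupled rendering of this kind is typically implemented with an asymmetric (off-axis) view frustum recomputed from the tracked head position each frame. The sketch below shows the idea for a single screen assumed to lie in the z = 0 plane; it is a minimal illustration, not the lab's renderer, which must handle arbitrarily oriented side and floor surfaces.

```python
def off_axis_frustum(eye, screen_l, screen_r, screen_b, screen_t, near):
    """Compute asymmetric view-frustum extents at the near plane for a
    head-coupled display (illustrative sketch).

    eye                      -- tracked head position (ex, ey, ez), ez > 0,
                                in the screen's coordinate frame
    screen_l..screen_t       -- screen edges in the z = 0 plane (same units)
    near                     -- near-plane distance for the projection
    """
    ex, ey, ez = eye
    # Similar triangles: project each screen edge onto the near plane.
    scale = near / ez
    return ((screen_l - ex) * scale,
            (screen_r - ex) * scale,
            (screen_b - ey) * scale,
            (screen_t - ey) * scale)

# Example: a 10 ft x 8 ft front screen, viewer 5 ft away at eye height 4 ft
l, r, b, t = off_axis_frustum((0.0, 4.0, 5.0), -5.0, 5.0, 0.0, 8.0, 0.1)
```

As the tracked head moves off-center, the frustum becomes asymmetric, which is what makes the imagery appear to extend beyond the physical walls.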
The simulation software, developed by the investigators, controls the motions of simulated vehicles to meet the specifications of experiment scenarios. Other software, including the DataVisualizer, permits coders to replay participants' movements (walking or riding) and automatically computes many of the variables of interest, such as gap size chosen, timing of entry, crossing time, and time to spare when crossing intersections. We will also record gaze direction from the motion tracking of each rider's head.
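The crossing variables named above reduce to simple arithmetic on event timestamps. The sketch below is a hypothetical re-implementation for illustration only; the function name and the exact variable definitions are assumptions, not the DataVisualizer's actual code.

```python
def crossing_measures(entry_time, exit_time, lead_pass_time, lag_arrival_time):
    """Compute gap-crossing variables from event timestamps (seconds).

    entry_time        -- participant enters the roadway
    exit_time         -- participant clears the roadway
    lead_pass_time    -- lead vehicle clears the crossing line (gap opens)
    lag_arrival_time  -- lag vehicle reaches the crossing line (gap closes)
    """
    gap_size = lag_arrival_time - lead_pass_time        # temporal gap chosen
    timing_of_entry = entry_time - lead_pass_time       # delay after gap opens
    crossing_time = exit_time - entry_time              # time in the roadway
    time_to_spare = lag_arrival_time - exit_time        # safety margin
    return gap_size, timing_of_entry, crossing_time, time_to_spare

# Example: gap opens at t=10 s and closes at t=16 s; the participant
# enters at t=11 s and clears the lane at t=14 s.
g, te, ct, tts = crossing_measures(11.0, 14.0, 10.0, 16.0)
```

Here the participant chose a 6 s gap, entered 1 s after it opened, spent 3 s in the roadway, and cleared the lane with 2 s to spare.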
Other significant equipment in the laboratory includes: 1) two HTC Vive head-mounted virtual environment display systems, 2) an nVisor ST head-mounted virtual environment display system, 3) an InterSense IS-1200 VisTracker wide-area 6-DOF inertial-optical tracker, 4) an Ascension 3D Guidance trakSTAR magnetic tracker, and 5) a LIDAR gun for measuring speeds and distances.