Week 8
For a number of reasons, DMP has allowed me to "extend" my internship with an additional week of correspondence research with Vicki and Brian. These reasons include: the head-mounted display being out for repairs for 7 of the 8 weeks of my stay in Minneapolis; the physiological monitoring equipment not arriving until after my stay; the IRB not approving the human-subject research until after my stay; and a week-long interruption of my research to attend Siggraph 08 (which was wonderful and not regretted in the least!). As it was, even with working equipment and approval, we could not have run many participants, if any, during my short stay.
Initially, we supposed we could submit a paper to IEEE VR by late September (9/29 was the submission date). However, IRB approval didn't come until Sept. 22, which left insufficient time to run a significant number of participants, let alone analyze the data and write a good paper. So the paper submission was pushed off until December to allow for better research.
And that brings us to here. In September, Vicki contacted me to say they'd run a few participants and found the data noisier than expected. She asked me to look into how similar papers interpreted the EKG data and to verify that the IBI (interbeat interval) is the signal used. She also asked how to reduce the noise enough to see the salient spikes in HR.
So, I spent a couple of days looking over the literature for more details on interpretation. Most papers were vague, but
I found some information. Meehan et al., in a paper similar to ours, measured
ΔHR = MeanHR(Pit Room) - MeanHR(Training Room)
where Training Room refers to a virtual room in which the participant gets acclimated to the VR and HMD and Pit
Room refers to a virtual room with the middle of the floor removed, showing down to a room below. They
noted that there is a lag of 2-3 seconds from time of stimulus to onset of reaction for all physiological
reactions [Andreassi, 1995]. Also, heart rate is affected by respiration; for accurate measurement, HR should
be averaged over one or two respiration cycles (which generally last about 4 seconds) [Seidel, 1995]. Finally,
they corrected for losses of balance by recording when they occurred and removing those data points.
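To make Meehan's ΔHR measure concrete, here is a minimal Python sketch. The function names, the window tuples, and the 2.5-second lag value are my own illustrative assumptions, not details from the paper:

```python
def mean_hr(beat_times, start, end, lag=2.5):
    """Mean heart rate (bpm) over the window [start+lag, end+lag].

    beat_times: sorted R-peak timestamps in seconds. The lag shifts the
    window to account for the 2-3 s delay between stimulus and
    physiological reaction noted by Andreassi.
    """
    ibis = [t2 - t1 for t1, t2 in zip(beat_times, beat_times[1:])
            if start + lag <= t2 <= end + lag]
    return 60.0 * len(ibis) / sum(ibis) if ibis else float("nan")

def delta_hr(beat_times, pit_window, training_window):
    """Meehan-style difference: mean HR in the Pit Room minus the Training Room."""
    return mean_hr(beat_times, *pit_window) - mean_hr(beat_times, *training_window)
```

With a synthetic beat train at 60 bpm during the training window and 120 bpm during the pit window, `delta_hr` comes out to 60 bpm, as expected.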
I also found a paper by Robert Stoermer et al. ("Monitoring Human-Virtual Reality Interaction: A Time Series Analysis Approach") which suggested an approach for each issue: QRS resolution for the noise and statistical analysis of the cardiotachogram. Stoermer reported that they first resolved the QRS complexes, the structure on the ECG that corresponds to the depolarization of the ventricles. After this smoothing of the curve, the single-channel ECG should reveal the IBI and the cardiotachogram parameters. Next, "The cardiotachogram was statistically evaluated, furthermore a spectral analysis was performed using a Fast Hartley transformation." The resulting signal should account for respiration and yield a result suitable for our analysis.
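To illustrate the first step of that pipeline — resolving R peaks and deriving the IBI series — here is a toy Python sketch. Real QRS detection (e.g., the Pan-Tompkins algorithm) is far more robust than this simple threshold test; this is only a stand-in to show the data flow from raw samples to interbeat intervals:

```python
def detect_r_peaks(ecg, thresh):
    """Indices of local maxima above thresh — a crude stand-in for QRS resolution."""
    return [i for i in range(1, len(ecg) - 1)
            if ecg[i] > thresh and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]]

def ibi_series(peaks, fs):
    """Interbeat intervals in seconds, from peak sample indices at sampling rate fs (Hz)."""
    return [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
```

The IBI series (the cardiotachogram, once plotted against time) is what would then be evaluated statistically and fed into the spectral analysis.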
Concerning the noise, if it was not reduced by looking at the data in different ways or accounting for respiration, I found that certain software included a Source Consistency Filter, which reduces noise and baseline artifacts.
I am not certain which of this information has been used or how they have decided to look at the heart rate for these experiments. However, this information gave Vicki and Brian more insight into the analysis of our data.
Finally, I provided my paper and sources of outside literature to Brian for help in writing the Introduction and related work components of our paper. I have remained available for correspondence, but have not been contacted for much additional help.
Works referenced
Meehan, M., et al. Physiological measures of presence in stressful virtual environments. SIGGRAPH 2002. San Antonio, TX, 2002.
Seidel, Henrik, and Hanspeter Herzel. Modelling heart rate variability due to respiration and baroreflex. In: Modelling the Dynamics of Biological Systems, edited by Erik Mosekilde and Ole Mouritsen. Berlin: Springer-Verlag, 1995.
Stoermer, Robert, Ralph Mager, Andreas Roessler, Franz Mueller-Spahn, and Alex H. Bullinger. Monitoring Human-Virtual Reality Interaction: A Time Series Analysis Approach. CyberPsychology and Behavior, June 2000, 3(3): 401-406.
Week 7
Alas, my stay in Minneapolis has drawn to a close. One exciting week away at Siggraph cut my stay down to 7 weeks total.
This week I adjusted the model twice. Initially, I added a table in the corner from which the participant would retrieve the virtual block. However, with the current configuration of the Vicon system, that area of the room is poorly tracked. So I made the bridge shorter and, at Vicki's recommendation, added a chair to hold the block instead, since empty tables are scarce. I also tested with the motion capture running, which let me finally align the virtual room accurately with the real room and fine-tune the picking up of the block. Now the user can only pick up the block when they are reasonably close to it. I similarly added controls to reset the block or return it to the user, should they drop it and we wish to limit the variability between subjects.
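The proximity check for picking up the block boils down to a distance test between the tracked hand and the block. Our actual code lives in the Ogre/C++ application; this Python sketch just shows the logic, and the 0.5 m radius is an illustrative value, not the threshold we used:

```python
import math

PICKUP_RADIUS = 0.5  # metres — hypothetical threshold, tune per tracking setup

def can_pick_up(hand_pos, block_pos, radius=PICKUP_RADIUS):
    """Allow grabbing only when the tracked hand is within `radius` of the block.

    hand_pos, block_pos: (x, y, z) positions in the same world coordinates.
    """
    dx, dy, dz = (h - b for h, b in zip(hand_pos, block_pos))
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= radius
```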
Finally, Vicki and I decided to require the participants to look down into the pit by adding numbers on top of the targets, which they have to read off. For this, I got to use rand again, which was kind of exciting (though not nearly as exciting as using clock).
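The idea is simply to randomize the digit shown on each pit-floor target per trial, so participants can't recite from memory. A hedged Python sketch (the function name and a fixed seed for repeatable trials are my own additions; the real code uses C's rand):

```python
import random

def label_targets(n_targets, seed=None):
    """Assign a random digit (0-9) to each pit-floor target for the read-off task.

    Passing a seed makes a trial's labels reproducible between runs.
    """
    rng = random.Random(seed)
    return [rng.randint(0, 9) for _ in range(n_targets)]
```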
And now, a slew of bad news: the HMD is still out for repairs, so in the 8-week span of my stay, we had it for one whole week. But don't worry, this was not the only thing keeping us from performing even a complete trial run of the experiment! The purchasing department lost Vicki's purchase request forms for the physiological monitoring devices, and we have yet to hear back from the IRB. As a result, not only do I miss the opportunity to help run these human-subject experiments; I don't even get to try the whole thing out myself. However, I will stay in correspondence and intend to help with the analysis.
And what kind of a final entry would this be without a picture:
Week 6
In all my excitement and planning for SIGGRAPH, I have neglected to post!
This week was much more productive. I finished the basics of the block handling that can be tested without tracking. I figured out how to animate using the clock, which is all kinds of new and fun. I also realized I need to make some kind of table from which participants will retrieve the aforementioned block; otherwise it will be too awkward. But that should be simple enough.
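Clock-driven animation amounts to reading elapsed wall-clock time each frame, so motion speed doesn't depend on frame rate. A minimal Python sketch of the idea (the project itself is Ogre/C++; `animate` and `update` are names I've invented for illustration):

```python
import time

def animate(duration, update, clock=time.monotonic):
    """Drive an animation from the clock rather than from frame counts.

    Calls update(progress) each iteration with progress in [0, 1], so the
    animation takes `duration` seconds regardless of how fast frames render.
    """
    start = clock()
    t = 0.0
    while t < duration:
        t = clock() - start
        update(min(t / duration, 1.0))
```

A usage note: injecting the `clock` argument also makes the loop testable with a fake clock that returns scripted timestamps.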
Vicki decided to try getting IRB approval for testing by amending their standing approval with the biofeedback information, rather than starting a whole new application. I have not heard back yet, and it is unlikely we will get many human subjects in before I leave, if any, but hopefully I can do some analysis from home. As for the biofeedback, Vicki seems to have figured it all out from one of the recommended systems and, I believe, has placed an order. We may or may not receive the monitors in time for me to test them or investigate them and their related procedures, but we should have them shortly!
Week 5
This week has flown by! Also, I am finally posting on a Friday, not a Tuesday. I think this is a big deal.
Vicki has not gotten back to me about the IRB (the human-subject testing approval) or the status of purchasing the biofeedback devices. So that's bad. Also, we've had some lengthy tours lately, which makes working a little more difficult because of the computer I work on.
However, I have had to rethink the use of auxiliary hardware for performing the task of moving objects. I've been learning Ogre and reading through lots of bits of the code in order to make this task happen. It shouldn't be that hard, but I am having some trouble figuring out how to get a particular bone (e.g., the right hand); the bone names are not apparent.
And finally, I have finished the consent form and instructions to a point of clarity for now. By this I mean, mostly for the instructions, that things may still change in the methods or task depending on equipment, etc. However, they are otherwise sufficiently explained.
Week 4
The halfway mark!
The major accomplishment of the week was the model. The change in task/plan changed what the model needs, so I adjusted the vertigo model and found a way to export it to a mesh recognizable by Ogre. I then built a full-floor mesh and added it to the model, as well as a way to toggle between the two floors.
I've been filling out the IRB form to get permission for the human experiments, and began revising the consent form and instructions. I still have a wee bit to do, but I've made a dent.
A couple of things are looking less good. Since talking to Vicki last Tuesday about the biofeedback equipment, I haven't heard any news. Hopefully I'll see her tomorrow and she'll have an update. The worse news is that, amid a changeover to computer-based accounting, the HMD does not seem to have been sent out for its repair yet. Given these delays, and the fact that the IRB seems to take a while to grant approval, I am wondering whether I'll be able to run the human experiments, or at least enough of them to write a paper. But perhaps I can do some analysis from home. I am optimistic, but concerned.
Week 3
Not the most productive of weeks. I got a little frustrated.
So, like I said, every journal article I've read uses equipment sold only to physicians. I told Vicki, and hopefully she can get more leads or more direction. She is going to ask the University and someone at the med school about buying possibilities. Hopefully we will have something ordered soon.
However, today Vicki and I worked out the details of the experiment and its tasks, so I have more to do, particularly more coding and graphics-related work, which pleases me. A lot more than I thought I'd have to do, really, which makes me happy. Except that we're into week 4 right now, which is a tad scary. But I work best under pressure.
The last bit of business is to apply for approval for human-subject testing, which takes a while. Hopefully it'll leave me enough time to get a decent-sized subject pool, but I'm concerned.
Week 2
At the start of the second week, I got a clearer picture of the tasks I needed to accomplish over my stay. I also launched the website, yay!
So Brian (a PhD student) helped me open the model in SketchUp, and lo and behold, there was a satisfactory version of it. I may do some tweaks to improve the fidelity of the transition from real to virtual. Brian loaded it into their program, so it is now an explorable room in the HMD, which sadly went out for repairs this week as well.
I have been looking more into getting devices and software to monitor and record heart rate and skin response. It is a bit frustrating, because the papers I've read that did similar things used equipment sold only to physicians, and as yet we have no physician overseeing our research. I am optimistic, but I really need to buy something soon so I can synchronize it with the program and become familiar with the software.
On a personal note, I think I got bronchitis, so I went on some antibiotics, and feel the healthiest I've been since getting here! Also, I found out yesterday that DMP approved funding for me to go to Siggraph in LA for my 7th week. Very excited for that!
Week 1
The first week has been mostly reading. I've started with recent research done in the Digital Design lab and am moving on to potentially relevant scholarly papers. I have also started looking into a heart rate monitor and a galvanic skin response monitor that meet our needs for obtaining quantitative data.
Also, I got all suited up and motion captured, took a spin in the VR HMD, and got a crash course on running the Vicon software.