Wednesday, December 16, 2015

Final Paper

Andrew Jacobson and John Schmitz
Fall 2015
VIST 405

Virtual Reality Pediatric Pulmonary Function Tests

     Three and a half months ago, when we were brainstorming ideas for our project, there was
one thing we both had in mind: virtual reality. Regardless of the medical direction we decided to
take, we wanted to incorporate virtual reality into our creation. The idea we chose to produce
was a virtual reality interactive experience designed for children required to perform
pulmonary function tests. These tests are used across all ages to examine lung performance in
terms of volume and air flow. The machines used can be intimidating for young children,
however, which can cause issues during trials. Thus, we aimed to ease the process for this
demographic, making it both more enjoyable for the user and more reliable for the tester.
We conceived our initial idea based on previous personal experience with our medical
subject. Growing up with chronic lung disease, Andrew had to undergo pulmonary function tests
often to check the progression of his illness. Many children with asthma or other lung-related
issues have to perform these tests. The tests are occasionally accompanied by a small animation,
seemingly created in the mid-1990s in Microsoft Paint, meant to incentivize children
to perform them (e.g. “balloons” or “candles”). However, these animations have long since lost
their effectiveness and need a massive update. Andrew’s mother, who is a pediatric physician, faces
this issue regularly; children are reluctant to perform reliable respiratory tests.
     We aimed to create a more realistic, immersive environment for children to help
incentivize them to provide more reliable statistics and to be more willing to do the tests at all. In
addition, we designed our interactive system to gamify the procedure, acting as a form
of respiratory “training” or “therapy” to help produce greater results over time for all ages of
patients.

     Our initial research into the subject revealed some interesting findings. We first looked at
an overview of how pulmonary function tests (PFTs) are performed and operate, the challenges
involved, and what happens after the tests. Aside from the information we discovered on the
process itself, we also learned that children are often not willing to put forth the effort to produce
reliable, consistent results (Seed). We found a study performed among roughly 100
children which found that children who used animations still had less consistent results over long
periods of time. The article we read on the study made it seem that the interactive game was at
fault for the results; however, a separate study showed that performance universally tends to
decline beyond what can be accounted for, for unknown reasons (Nystad). This
suggests that it is not the animation's fault, but something entirely unrelated to the first study,
and that diminishing test performance over time is only natural (Neale). The last study
involved two separate groups of adults. One group did nothing different from their
standard PFTs, while the other group used a game to help “train” for PFTs. The group that used
the game far outperformed the group that only did PFTs, suggesting that interactive
visualization actually promotes the improvement of lung capacity and function among patients.
There were other people who had conducted research into this idea, yet had not developed it fully
nor gone in the direction we opted to take, but their work still showed some initial promise (Bingham).
Our objective was to create a virtual reality program using Unreal Engine 4,
integrated with the Oculus Rift and a custom-built electronic spirometer. We aimed to build two
demos this semester. Initially, we planned to have one consist of a sailboat that would be
propelled across a lake based on how the user performs. The other would simulate the big bad
wolf blowing down a little pig’s house. We chose these two since they were fairly gender
neutral in design. However, we changed our mind on the latter, since it proved too difficult to
accomplish within the time frame allotted. Instead, we decided to update the cake level, which was
one of the original level ideas. Some of the other levels discussed as potential options
included: 1) a rocket level where the player had to perform better to get farther into space, and 2)
a level where the player rode a horse and had to blow into the spirometer to leap over hurdles.
We produced prototypes for all of these concepts except the “little pigs” level. These levels
might be further explored and/or developed later.

     The spirometer was created using an Arduino board and a compatible breadboard, along
with the following materials: PVC pipe, plastic tubing, glue, a Honeywell ASDX ultra-low
silicon pressure sensor, a micro USB cable, and wires for circuitry purposes. The general design
of the electronic spirometer was based on the design of an electronic spirometer by a
biomedical engineering PhD student at Vanderbilt University. The code we used was also a modification
of the same person’s code. In order to get the code to interact with Unreal Engine 4, we utilized a
free plugin that provided minimal, yet sufficient, communication between the two programs.
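As an illustration of the sensor side (not the actual Vanderbilt code, which we only modified), the conversion from a raw Arduino ADC count to a pressure value might look like the following. The 10%-90% output window is typical of ratiometric Honeywell ASDX analog parts, but the exact transfer function depends on the specific part number, so treat the constants here as placeholders:

```cpp
#include <algorithm>

// Hypothetical sketch: convert a 10-bit Arduino ADC count (0-1023) into a
// differential pressure reading for a ratiometric analog sensor such as the
// Honeywell ASDX series. These sensors typically swing from 10% to 90% of
// the supply voltage across their rated pressure range; check the datasheet
// for the exact constants of the part in use.
double pressureFromAdc(int adcCount, double fullScalePressure) {
    double fraction = adcCount / 1023.0;                   // share of supply voltage
    double p = (fraction - 0.10) / 0.80 * fullScalePressure;
    return std::min(std::max(p, 0.0), fullScalePressure);  // clamp to the valid range
}
```

In our setup the Arduino streamed values like this over serial at a fixed rate, and the plugin read them on the Unreal side.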
We performed a proof-of-concept trial that tested the general idea. In the test, the
user blew into a paper bag that emulated the electronic spirometer while wearing an Oculus
Rift that played videos timed to the action of blowing into the bag. The trials provided useful
feedback that slightly changed the direction of our project, including which demos we
ultimately decided to develop.

     Following the testing, we completed our cost-effective spirometer and continued
improving our project. We first focused on the general performance of the games, and then
proceeded to improve the graphics of each. Unfortunately, the computers we worked on
throughout the semester could only handle medium-quality lighting, but they were generally able to
handle the high input frequency of the Arduino. That being said, one of the biggest issues we
faced was the program crashing, which we believe was related to a potential memory leak in
the Arduino code. We weren’t able to fully resolve the issue, but we got it to a working condition.
Other issues we confronted included the learning curve we had to overcome with both
Arduino and Unreal Engine 4, getting the plugin to operate correctly, and resolving logic-related
code issues.

     During our initial presentation, our programs worked effectively as predicted, minus a
few crashes that we experienced. At the VIST show, however, we faced unforeseen issues. The
threshold values we had programmed to determine at what levels the candles would blow out stopped
registering. It took some time to resolve the issue, which involved setting the numbers at much
lower levels. After that was fixed, however, the overall reception of our project was great.
Users appeared to thoroughly enjoy our product. They seemed to be more enthused by the
candle demo than the sailboat demo, since there wasn’t an observable goal in the latter. Other
criticisms and recommendations included the lack of birthday-related objects in the “cake
scene” and the suggestion to create markers for judging the distance traveled in the sailboat level.
People were able to infer what they were supposed to do fairly easily with few instructions given,
which is beneficial for mass installation purposes. That being said, knowing the issues with our
current configuration, the next step is to determine where our development goes from here.
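The candle logic ultimately boils down to comparing the peak exhale value against a list of thresholds, and the on-site fix amounted to lowering those numbers. A hypothetical sketch of that logic (the threshold values here are illustrative, not our calibrated settings):

```cpp
#include <vector>

// Count how many candles go out for a given peak exhale reading: each
// threshold the reading clears extinguishes one more candle. Keeping the
// thresholds in data rather than hard-coding them makes on-site
// recalibration (like the one we had to do at the VIST show) a matter of
// editing one list.
int candlesBlownOut(double peakExhale, const std::vector<double>& thresholds) {
    int count = 0;
    for (double t : thresholds) {
        if (peakExhale >= t) ++count;
    }
    return count;
}
```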

     Since Andrew plans on continuing the research and development of this project, he
intends to refine the demos we currently possess into more finalized formats. Following the
completion of those two, he may create another demo for testing purposes, perhaps
one of the demos listed above that we chose not to develop. There has also been discussion of going to
Startup Aggieland in order to pursue a business venture. Hopefully, in doing so, we would
be placed in contact with individuals who have more expertise in the medical or
biomedical engineering fields. Regardless, the development of our project went well
enough, considering the number of technical issues we faced throughout the process. We believe
that our concept has a lot of potential and can be carried to a more finalized development point,
which we hope to achieve in the coming months or years.

Works Cited
Bingham, Peter M., Thomas Lahiri, and Taka Ashikaga. "Pilot Trial of Spirometer Games for
Airway Clearance Practice in Cystic Fibrosis." Respiratory Care 57.8 (2012): 1278-1284.
CINAHL Complete. Web. 16 Dec. 2015.
Neale, A. V., and R. Y. Demers. "Significance of the Inability to Reproduce Pulmonary
Function Test Results." Journal of Occupational Medicine: Official Publication of
the Industrial Medical Association 36.6 (1994): 660-666. MEDLINE Complete. Web.
16 Dec. 2015.
Nystad, W., et al. "Feasibility of Measuring Lung Function in Preschool Children." Thorax 57.12
(2002): 1021-1027. Academic Search Alumni Edition. Web. 16 Dec. 2015.
Seed, Laura, Allan Coates, Susan Carpenter, and Jennifer Leaist. "Pulmonary Function
Testing." AboutKidsHealth. SickKids, 17 Mar. 2010. Web.

Final project pictures






Documentation video

Thursday, November 19, 2015

level asset highlighting



Right now the scene is just two rockets. One of the rockets needs to be a sailboat, and I will add that in later tonight. This is the opening scene the player will always start in. When the player moves his or her head, the object they are looking at will highlight. When the player pushes a button on the spirometer, the highlighted object will be selected and its corresponding game will start. Right now the scene is very basic, but I will talk to Andrew and see if we want to add grass and trees or something, or leave it as-is.

Thursday, November 12, 2015

Head tracking and actor highlight selection test



Highlighting an actor based on where the player is looking was a success. The next step will be to build the scene to look how we want, with three selectable objects that can be selected with a button on the Arduino spirometer once highlighted.

Monday, November 2, 2015

Head movement & object highlighting research and progress

Here are some examples and how-to's I've found for making objects in the default scene highlight when the Rift is rotated in their direction:

https://www.youtube.com/watch?v=j8gjvKyaQuU

https://wiki.unrealengine.com/Oculus_Rift_Blueprint

I think the Get Orientation commands will let us highlight a specific object depending on where the player is facing.

I will start testing this week.
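As a rough sketch of that idea: treat the headset orientation as a forward vector and highlight an object when the angle between that vector and the direction to the object is small. The math below is illustrative plain C++, not actual UE4 blueprint nodes or API calls:

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

// Return true when 'toObject' lies within 'maxAngleDeg' degrees of the gaze
// direction 'forward'. Both vectors are normalized, then compared via their
// dot product (the cosine of the angle between them).
bool isGazedAt(Vec3 forward, Vec3 toObject, double maxAngleDeg) {
    auto normalize = [](Vec3 v) {
        double len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
        return Vec3{v.x / len, v.y / len, v.z / len};
    };
    const double pi = 3.14159265358979323846;
    Vec3 f = normalize(forward);
    Vec3 d = normalize(toObject);
    double cosAngle = f.x * d.x + f.y * d.y + f.z * d.z;
    return cosAngle >= std::cos(maxAngleDeg * pi / 180.0);
}
```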


Monday, October 26, 2015

Horse scene progress

Here is the progress I've made to the horse scene.


The camera will follow the horse, and the horse will walk along a preset path and jump over the fence when the user exhales properly. There is a grass texture and procedurally generated grass in the scene.

5 Demos

We plan on making 5 demos so that we can have a range of games to choose from. Some of these ideas were based on feedback from users during our preliminary testing. The 5 demo ideas are:

- A rocketship that shoots into the night sky as the user exhales
- A birthday cake full of candles that blow out as the user exhales
- A game where the user blows down a house in similar fashion to the big bad wolf
- A game where the user tries to blow a toy sailboat across a pond
- A game where a pony runs around a track and will jump over a hurdle if the user
  exhales hard enough.


There will be a main sandbox-type level, containing an object from each type of game, that the user starts in every time the simulation is run. Upon walking into one of the objects, that object's corresponding game will run.

Update on concept, materials, and schedule

Concept Design
Our objective is to create a virtual reality program using Unreal Engine 4 that is integrated with the Oculus Rift and a custom-built electronic spirometer. With this program, we hope to better incentivize children with pulmonary issues to be more willing to perform reliable pulmonary function tests (PFTs).

We plan on creating two demos, if possible, this semester. One will consist of a sailboat that is propelled across a lake based on how the child performs.  The other will simulate the big bad wolf, blowing down a little pig’s house. We decided on these two, since they are fairly gender neutral in design.

The spirometer will be created using an Arduino board and a compatible bread board, along with the following materials.

Materials
-PVC pipe
-Plastic tubing
-Glue
-Pressure sensor (Honeywell brand?)
-Micro USB cable
-Wires

Schedule
Oct. 6: Interaction Design – paper prototype
Oct. 13: Project Proposal Presentation
Oct. 20: Blocked in first game
Oct. 27: Electronic Spirometer completed
Nov. 3: Spirometer works with program/Block-in 2nd scene
Nov. 10: Update graphics for both demos/Fine-tune Spirometer
Nov. 17: Continue finishing product (graphics, precise measurements)
Dec. 1: System Implementation Done
Dec. 8: Project Presentation
Dec. 10: VIST Show presentation
Dec. 15: Research paper and Documentation Video

Preliminary tests


We had users blow into long paper bags, and answer some questions about the experience in order to get feedback on our concept. Here were some of the questions we asked our users:

  1. This project is for 5-11 year olds. What do you think about that?

  2. What were your initial thoughts about the project?

  3. What did you think afterwards?

  4. Was it engaging?

  5. Did it make the experience better or worse?

  6. What did you think about the sailboat scene?

  7. What did you think about the airplane scene?

  8. What other environments would you like to see?

  9. What additional thoughts do you have, if any?


Tuesday, September 22, 2015

Idea Proposal

I will be working with Andrew on a 3D, possibly virtual reality, pulmonary function testing environment. I copied his write-up for the idea below. He proposed the idea to me last night, and I decided it seemed really solid, so Andrew and I decided to morph our original idea into this:

Background Concept: 
Growing up with chronic lung disease, I had to undergo pulmonary function tests often to check the progression of my illness. Many children with asthma or other lung issues have to perform these tests. The tests are occasionally equipped with a small animation that appears as if it was created in the mid-1990s in Microsoft Paint to help incentivize children to perform them (e.g. “balloons” or “candles”). The tests have long passed their effectiveness, however, and need a massive update. What may have worked in the past no longer does. My mom, who is a pediatric physician, faces this issue regularly; children are reluctant to perform reliable respiratory tests.
I aim to create a more realistic, immersive environment for children to help incentivize them to provide more reliable statistics and be more willing to do the tests at all. In addition, I hope to gamify the system to act as a form of respiratory “training” or “therapy” to help produce greater results over time for all ages of patients.
Summary of Research: 
The first link is not an academic article; however, it is a website that focuses on the wellness and health of children. It gives an overview of how pulmonary function tests (PFTs) are performed and operate, the challenges involved, and what happens after the tests. Aside from the information it provides on the process itself, it notes that children are often not willing to put forth the effort to produce reliable, consistent results. The second link discusses a study performed among roughly 100 children which found that children who used animations still had less consistent results over long periods of time. The article makes it seem that the interactive game is at fault for the results; however, in the third link, another study showed that performance universally tends to decline beyond what can be accounted for, for unknown reasons. This suggests that it is not the animation's fault, but something completely unrelated to the first study, and that diminishing test performance over time is only natural. The last link explains the results of another study that involved two separate groups of adults. One group did nothing different from their standard PFTs, while the other group used a game to help “train” for PFTs. The group that used the game far outperformed the group that only did PFTs. This suggests that interactive visualization actually promotes the improvement of lung capacity and function among patients.
Sources:

3 research papers

treating GAD with virtual reality
Detecting anxiety using wearable monitoring
Effects of 3D motion on the brain

Tuesday, September 8, 2015

First project idea

Inspired by this scene in Ratatouille and this game, we want to make a relaxing interactive environment that responds to your heart rate and other bodily feedback. Everything starts out black, then the space slowly reveals a colorful, lit environment the more relaxed you get.

Project ideas and example projects

meditative
something cool
relaxing
immersive interactive 3D visualizer
Oculus visualizer utilizing 3D

Wednesday, September 2, 2015

Introduction


I'm John. 

I'm from Rowlett, TX.
I like games, movies, and cartoons,
and I'll probably never grow up.

I'm not sure what I want to do yet
 but I would really like to make games,
 rig models, or work for a creative agency. 

I enjoy programming and solving problems.