Thursday, April 28, 2011

1st day: 27th April, 2011

A significant day in our research project, and indeed in our entire university life. The three of us were looking forward to having a tremendous experience!

8.00am:
Entered the J'pura University. Had breakfast at the Arts Faculty canteen, with a lot of uncertainty and curious minds. Although there was a possibility of getting negative results for the hypothesis, it was a pleasure to do this kind of experiment.

8.45am:

There was a room used for sound recording, soundproof and therefore ideal for carrying out our experiments. Kasun had booked it in advance with Chanaka Sir's approval. We managed to set up the room by 9.10am. The main components of the setup were a laptop with the prototype to be evaluated, USB keyboards, an external sound card, and 4 speakers. The distance between two adjacent speakers was approximately 1 meter. In addition, some boxes and unused CPU cases were used to hold the speakers. The finished setup is shown in the picture.
There were two systems to be evaluated and compared: with and without positional information. In total, 10 students were to be evaluated over 2 days. Since each student took approximately 1 hour, only 5 students could be evaluated per day.
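The post doesn't describe how the prototype maps a page element's position to the 4 speakers, so as a purely hypothetical sketch (not the authors' implementation), one simple approach would be bilinear panning: place the speakers at the four corners and weight each speaker's gain by how close the element's normalized (x, y) position is to that corner.

```python
def speaker_gains(x, y):
    """Hypothetical bilinear panning: map a normalized page position
    (x, y) in [0, 1] x [0, 1] to amplitude gains for four speakers
    placed at the corners of the listening area.

    (0, 0) is the top left corner, matching the page layout described
    in the post; the four gains always sum to 1.
    """
    if not (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0):
        raise ValueError("x and y must be in [0, 1]")
    return {
        "top_left":     (1 - x) * (1 - y),
        "top_right":    x * (1 - y),
        "bottom_left":  (1 - x) * y,
        "bottom_right": x * y,
    }
```

For example, an element at the top left corner, `speaker_gains(0.0, 0.0)`, plays entirely from the top-left speaker, while an element in the middle of the page, `speaker_gains(0.5, 0.5)`, plays equally from all four. This is only an illustration of the general idea; the actual prototype may pan the sound quite differently.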


Participants of the day.
B - blind, PB - partially blind, F - female, M - male


Positioned
Non Positioned
Nilmini (B) (F)
Harshani (PB) (F)
Kasun (PB) (M)
Chaminda (B) (M)
Aruna (PB) (M)


9.40am:
Nilmini was ready to be our first test subject. She was born blind. Her computer literacy is moderate, and she had done a computer course some time back.

Kanishka started to demonstrate the system with positional information. The session was recorded and some pictures were taken, with their approval.
Right away, she identified that the page begins from the top left corner, even before Kanishka explained it.
Another important observation about Nilmini was that she turned her head according to the sound. She unreservedly looked at the exact position the sound came from. She identified the Personal Info structure as a table with 2 columns. Even though they can draw table structures in Braille, they had never experienced structures such as tables and graphs in computer applications.

As we had observed even in the pilot studies, there is a slight difference in identifying up-down and left-right positions: left-right can be understood more clearly than up-down. Nilmini too indicated that there is a difference, but said up-down can also be identified without any effort.
She was very excited about the application and asked us whether it was imported from another country!

10.45am:
Harshani, a girl who is partially blind, was also given the system with positional information and asked to perform several tasks. One of them was to find the 'Friend Summary' element on the home page. The picture below shows how she attempted to find that element.
She started from the top left corner of the page and navigated to the right. She didn't realize that there were only 3 elements in the row, so she tried to go to a 4th element. When the notification sound played, she realized that it was the end of the page. Then she went down and found Friend Summary.
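The navigation model the participants worked with can be sketched in a few lines: page elements form a grid, the focus starts at the top left corner, arrow keys move the focus, and moving past the last element plays a notification sound. The element names and grid layout below are made up for illustration; they are not the actual home page from the study.

```python
class PageNavigator:
    """Minimal sketch (not the authors' code) of arrow-key navigation
    over a grid of page elements, with an end-of-row notification."""

    def __init__(self, rows):
        # rows: list of rows, each a list of element names
        self.rows = rows
        self.r = 0  # focus starts on the top left corner element
        self.c = 0

    def focused(self):
        return self.rows[self.r][self.c]

    def right(self):
        if self.c + 1 < len(self.rows[self.r]):
            self.c += 1
            return self.focused()
        return "NOTIFICATION"  # end-of-row sound; focus stays put

    def down(self):
        if self.r + 1 < len(self.rows):
            self.r += 1
            # clamp the column if the next row is shorter
            self.c = min(self.c, len(self.rows[self.r]) - 1)
            return self.focused()
        return "NOTIFICATION"
```

With a hypothetical layout such as `PageNavigator([["News", "Messages", "Photos"], ["Settings", "Help", "Friend Summary"]])`, Harshani's path reproduces as: right, right, right (notification at the end of the 3-element row), then down to reach Friend Summary.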
When she was asked to explain the positions of the home page elements as she remembered them, her answer was that she didn't remember, since she had used the system only once. But she said it would be useful if the logout button were always in the same place.
Harshani identified the table structures correctly and commented that they were extremely useful. She also said it was clear and easy to understand. For the question on identifying up-down and left-right positions, she said it is difficult to identify up-down positions compared to left-right.

11.45am:
Kasun was the one who organized everything for us at J'pura University. He is the president of the association for students with disabilities.

He was first given the non-positioned system, and we noted down his performance. Then he was given the system with positional information. Even though he identified the correct position, he had some problems navigating to it. He always tried to go to the left first (from the left corner element).

As I observed, there was another issue: he tried to start navigating from the position where the sound stopped. But the auto-read functionality just reads out the page without focusing on any element; at the beginning the focus is on the top left corner element, until the user moves it using the arrow keys. He is very much used to his current system. Although he identified the table structure, he proposed that we label it as a table and read that label out. He is not so good at listening; the reason may be that he is partially blind.
However, he is such a good organizer!

1.00pm-1.45pm: Lunch hour.

1.45pm:
Chaminda was the best-performing subject, as we observed. He was also given both the non-positioned and positioned systems. The very first time the home page was played, he understood that the page has 9 elements, in both x and y directions. In the Friend Search task he navigated straight to the exact position, without any mistakes.

He identified the table structure correctly; he said it has two columns. Even though he had heard about table structures in computer applications, he had never experienced one like this. He wanted to have this feature when drawing new tables, which they cannot do with current applications. 'Friend Summary', a comparison-style table, was also understood as a 4-column table with headings and data under those headings.
His comment on x/y dimensional identification was that up-down is not so difficult, but left-right is easier.

2.50pm:
Aruna was the last test subject of the day. He also turned his head according to the sound, like the 1st user. Even though they can identify a position by sound, it is difficult to navigate to that position using the arrow keys. It's not because they cannot find the place, but because they have forgotten which element currently has focus.
He identified the table correctly and navigated to the Friend Summary, just like a sighted person would. He forgot the location at first and was in the wrong place, but after listening a second time he resumed from the exact location. We don't yet know why, but we will investigate.
The issue of identifying up-down and left-right positions was the same: up-down is difficult and left-right is easy.
His final comment was that we should develop this concept further and build an application with sound positioning.

3.45pm:
Done for the day and getting ready to go home. We hope to see the other 5 participants on Friday the 29th :)
