Alex Fiannaca 
Ph.D. Student & Software Engineer 
A technology aficionado with a passion for computer science, accessibility, and exploration!
About Me

I’m currently a graduate student in Computer Science and Engineering at the University of Washington. I am co-advised by Maya Cakmak (Human-Robot Interaction) and Richard Ladner (Accessibility), and I am researching a range of technologies in the accessibility domain (you can read more about these in the next section below). Additionally, I collaborate with Meredith Ringel Morris and the Microsoft Research (MSR) NExT Enable Team on communication technologies for people with severe motor impairments like ALS.

In Spring of 2016, I received my second M.S. in Computer Science and Engineering from the University of Washington. My master’s qualifying exam was based on research I performed at MSR in the summer of 2015. In this work, we attempted to alleviate many of the current communication issues in Augmentative and Alternative Communication (AAC) for people with ALS by redesigning an AAC system from a groupware perspective. This work is currently under submission.

In May of 2014, I graduated with my first M.S. in Computer Science and Engineering from the University of Nevada, Reno. My master’s thesis research centered on the development of assistive technologies related to spatial perception for people with visual impairments. This involved two major projects:

  1. The first project dealt with tactile-proprioceptive displays that allow blind users to interact with large displays (published in the proceedings of Graphics Interface 2013). This work was evaluated with blind children at Camp Abilities, a summer camp for children with disabilities (submitted to CHI 2015).
  2. The second project, known as Headlock, is related to the Navatar project being developed by Ilias Apostolopoulos. In my branch of this project, we are attempting to ease the navigational challenges blind users face when crossing large open spaces that lack tactile landmarks (i.e., spaces in which white canes are not particularly useful). This project was developed on Google Glass with the goal of creating a lightweight, unobtrusive mobile interface. A preliminary study of the Headlock application was published in the proceedings of ASSETS 2014.

Prior to my work in computer science, I earned a B.S. in Biochemistry and Molecular Biology (Magna Cum Laude) from UNR, and before that, I graduated Valedictorian from Spanish Springs High School in Sparks, Nevada.

Check Out My Current Project

MapAll: Enabling Customizable Interactive Accessibility

Over the past several decades, the QWERTY keyboard and the optical mouse (or, less commonly, the trackball) have become accepted as the default input devices for desktop computing environments. While these standard forms of input are well suited to the average user, they can be difficult, if not impossible, to use for a group that could greatly benefit from access to computing: people with severe mobility impairments (muscular dystrophy, Friedreich’s ataxia, spinal muscular atrophy, etc.). To address this issue, researchers have developed many augmentative and alternative communication (AAC) systems relying on non-standard forms of input such as switches, joysticks, and gaze trackers. Unfortunately, these systems are often targeted at creating accessible interactions custom-tailored to the needs of a single user, or of a small subset of mobility-impaired users who share a common form of impairment. I am currently developing a new system that leverages end-user development (EUD) to let users of all levels of ability create personalized interactions suited to the input devices that work well for them individually. This system has the potential to serve as a test bed for AAC technology researchers and developers to both explore possible solutions and distribute assistive technology directly to their primary stakeholders.

As a proof of concept, I have developed a program that makes drag-and-drop interfaces, like the popular Scratch EUD environment, accessible to users who cannot perform the “click-and-drag” motion with a mouse. Click here for more information on this proof of concept!

Previous Projects


Headlock

Traversing large open spaces is a challenging task for blind cane users, as such spaces are often […]

Haptic Target Acquisition

In this project, we developed non-visual displays that allow people who are blind to interact with large displays.


More Project Descriptions are on Their Way!
Get In Touch

Snail Mail

  • Attn: Alex Fiannaca
  • Computer Science & Engr
  • Box 352350
  • Seattle, WA 98195-2350

Copyright © 2014 Alex Fiannaca