A UX designer seeks information/advice regarding iOS development
I am a UX designer seeking information/advice regarding iOS development. The question is both developmental and political. I work for a large retailer whose stores currently use an archaic product scanner called an RMU. Its interface employs a non-touch display and an extensive physical keypad. The RMU is so old it is no longer for sale.
The company thus seeks to rapidly develop a replacement employing an iTouch attached to a scanning sled. Rather than create a new experience employing the potential of the iTouch, the company's developers seek to emulate the RMU's interface exactly. That means the iTouch would display a menu of choices in precisely the same way as the RMU, which would look like this:
1 - press 1 for option A
2 - press 2 for option B
3 - press 3 for option C
X - press X for option D
Y - press Y for option E
etc.
The developers claim that it is technically impossible to enable touch for this menu. Thus users would have to invoke one of five virtual keypad menus to actually press 1, 2, 3, X, or Y as needed. This virtual keypad would of course cover the display menu.
I'm not a developer, but I can't believe that if the iTouch is pulling data from a backend system, displaying it on a touchscreen, capturing touch events, and associating those events with functions communicated back to the backend, it can't capture the touch events at the same time that it displays the data, GIVEN that this is exactly what the iTouch was created to do.
My suspicion is that the developers are salvaging work done for a previous attempt to emulate the RMU. In that endeavor, a Janam "Symbol" based on a Palm-style device (running Windows) was employed. The Janam (like the RMU) uses a display screen and physical keypad. Thus in reality, the iTouch would emulate the Janam, which is itself emulating the RMU.
My concern is the terrible interface this will create, one that decouples display from functionality, in principle identical to a TV that can only change channels when it is turned off. So at last my question: am I just being paranoid, or are the developers gaming the project?
It is, of course, possible for the iTouch to display data and allow the user to touch it to do things. You only have to go as far as the built-in address book to see that.
It sounds very much like their iTouch application is basically a dumb terminal: it displays output from some central server and sends user input strings back for processing by that server. So yes, if they are using that model, then it is technically impossible to make the text on the screen touchable; even scraping the text and parsing it into a menu for user-friendly display would go beyond the "dumb terminal" model.
But that is probably the stupidest and laziest possible design for the application.
If the application interface doesn't need to be changed often, the interface should be done in code in the application and just the data loaded from the central server.
If requirements are such that the interface does need to be loaded from the central server for some reason, the interface should be served up in some sort of structured format that the app can then display sanely. For example,
<menu>
<item>Option A</item>
<item>Option B</item>
<item>Option C</item>
<item>Option D</item>
<item>Option E</item>
</menu>
and then the app can format it for display with each menu item touchable.
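As a rough illustration of that approach, here is a minimal Swift sketch (my own, not from the project in question) that parses a `<menu>` payload like the one above into an array of item titles using Foundation's `XMLParser`. In a real app, each title would back a touchable row, e.g. a `UITableView` cell or a button; the class and element names here are assumptions based on the example XML.

```swift
import Foundation

// Sketch: turn the server's <menu><item>…</item></menu> XML into
// plain strings that the app can render as touchable menu rows.
final class MenuParser: NSObject, XMLParserDelegate {
    private(set) var items: [String] = []
    private var currentText = ""
    private var insideItem = false

    func parse(_ xml: String) -> [String] {
        items = []
        let parser = XMLParser(data: Data(xml.utf8))
        parser.delegate = self
        parser.parse()
        return items
    }

    func parser(_ parser: XMLParser, didStartElement elementName: String,
                namespaceURI: String?, qualifiedName: String?,
                attributes: [String: String] = [:]) {
        if elementName == "item" {
            insideItem = true
            currentText = ""
        }
    }

    func parser(_ parser: XMLParser, foundCharacters string: String) {
        if insideItem { currentText += string }
    }

    func parser(_ parser: XMLParser, didEndElement elementName: String,
                namespaceURI: String?, qualifiedName: String?) {
        if elementName == "item" {
            items.append(currentText.trimmingCharacters(in: .whitespacesAndNewlines))
            insideItem = false
        }
    }
}

let xml = """
<menu>
  <item>Option A</item>
  <item>Option B</item>
  <item>Option C</item>
</menu>
"""
let items = MenuParser().parse(xml)
// Each element of `items` would become one tappable row in the UI.
```

The point is simply that once the server sends structure instead of preformatted terminal text, wiring a tap on "Option B" to a reply of "2" (or whatever code the backend expects) is ordinary iOS work, not an impossibility.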
At any rate, this doesn't sound like a problem with a technical solution. Whoever is in charge of this project needs to decide whether they want a crap interface that the end users will hate, or one that makes sense for the technology being used. OTOH, that decision may have already happened.