US20110310041A1 - Testing a Touch-Input Program - Google Patents

Testing a Touch-Input Program

Info

Publication number
US20110310041A1
Authority
US
United States
Prior art keywords
touch input
application program
user interface
accessibility
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/165,672
Inventor
Joshua Matthew Williams
John D. Gale
Michael Edward Creasy
Matthew Even Dreisbach
Eric J. Albert
Phillip Roy Thompson
Christopher Brian Fleizach
Stephen Richard Lewallen
Mark H. Firth
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc filed Critical Apple Inc
Priority to US13/165,672
Assigned to APPLE INC. reassignment APPLE INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ALBERT, ERIC J., LEWALLEN, STEPHEN RICHARD, FLEIZACH, CHRISTOPHER BRIAN, FIRTH, MARK H., THOMPSON, PHILLIP ROY, CREASY, MICHAEL EDWARD, DREISBACH, MATTHEW EVEN, GALE, JOHN D., WILLIAMS, JOSHUA MATTHEW
Publication of US20110310041A1
Current legal status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3696 Methods or tools to render software testable
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management

Definitions

  • This subject matter is generally related to software development.
  • a software development process can include a structure for creating and maintaining a software product.
  • a software development process can include multiple stages. Some exemplary software development stages can include design, implementation, testing, and distribution. Some models of software development processes in use today include the waterfall model, the spiral model, agile software development, and extreme programming (XP), among others.
  • a testing mechanism queries the application program's user interface to identify a touch input that is to produce a specified result.
  • the testing mechanism then generates one or more signals simulating the touch input. These signals are input to the application program.
  • the application program responds accordingly, as if a user had actually performed the touch input. For example, the application program modifies its user interface based on the (simulated) touch input.
  • the testing mechanism then queries the application program's user interface again, to determine whether the user interface conforms to the specified result.
  • the application program can include a user interface element that accepts the touch input.
  • the user interface element can include an accessibility component that describes an accessibility attribute of the user interface element.
  • the one or more signals simulating the touch input can be provided at least in part based on the accessibility attribute described in the accessibility component of the user interface element.
  • a result can include an accessibility feature that corresponds to the accessibility attribute described in the accessibility component of the user interface element.
  • the accessibility feature employed is the same one that can be used to provide a voice readout of the application program's user interface.
  • FIG. 1 is a flowchart illustrating an exemplary process of testing an application program for a device.
  • FIG. 2 illustrates components of an exemplary system implementing automated testing of a touch-input program.
  • FIGS. 3A-3D illustrate exemplary techniques of testing various touch screen input programs.
  • FIGS. 4A and 4B illustrate exemplary techniques of testing programs accepting inputs other than touch screen input.
  • FIG. 5 illustrates an exemplary user interface of testing a touch-input program.
  • FIG. 6 is a flowchart illustrating an exemplary process of automatically testing a touch-input program.
  • FIG. 7 is a flowchart illustrating an exemplary process of testing a touch-input program in an exemplary testing environment.
  • FIG. 8 is a block diagram illustrating an exemplary device architecture of a mobile device implementing the features and operations described in reference to FIGS. 1-7 .
  • FIG. 9 is a block diagram of an exemplary system architecture for implementing the features and operations described in reference to FIGS. 1-8 .
  • FIG. 1 is a flowchart illustrating an exemplary process 100 of testing a program for a device.
  • a developer acquires (e.g., by downloading) an SDK of a platform, and develops an application program.
  • the application program is to be distributed (e.g., provided for download) to other users of devices compatible with the platform.
  • any person who engages in any part of developing the application program can be a developer.
  • the developer can develop ( 102 ) an application program that accepts touch inputs.
  • Developing the application program can include, for example, gathering requirements, designing the application program, writing source code of the application program, compiling the source code into binary code, and linking the binary code into executable code.
  • the touch inputs can include inputs to a touch input component (e.g., a touch-sensitive display, a touch pad, or a touch switch).
  • the touch screen input can include single-touch and multi-touch input. Additionally or alternatively, the touch inputs can include other motion inputs, e.g., shaking, rotating, or generally moving the device on which the application program executes.
  • the application program, or simply program, that is developed in process 100 can include any computer instructions that are configured to perform user tasks (e.g., tasks that produce results for a user) or system tasks (e.g., tasks that manage computing resources of a computer) or both.
  • the application program can be an application program based on a specified platform.
  • the platform can include libraries that provide functions and application programming interfaces (APIs) supporting the touch inputs.
  • an API can enable the developer to utilize various features of a touch input component, or various features of motion sensors, accelerometers, angular rate sensors, and magnetometers.
  • the developer can test ( 104 ) the application program, for example, by executing and debugging the executable code.
  • the testing can be accomplished using one or more test scripts.
  • the test scripts can include one or more instructions written in a software language that test a part or all of the functionalities of the application program and determine whether the application program meets requirements guiding the design and development of the application program, and performs reliably and cleanly (e.g., without causing memory leaks).
  • the test scripts can test whether functionalities that accept touch inputs work correctly with corresponding hardware (e.g., a touch input component) of the device.
  • the testing can be conducted on the device, on a host computing device to which the device is tethered or wirelessly connected, or on a computer that executes an emulator of the device.
  • the developer can submit ( 106 ) the application program for review by a system or by a system developer (e.g., a developer responsible for the integrity of the platform).
  • Submitting the application program for review can include uploading the linked binary executable code of the application program and the test script to a server for automatic or manual review.
  • Submitting the application program for review can include submitting one or more test scripts.
  • a system can automatically execute the submitted test script and test functionality and performance of the submitted application program on one or more types of devices or device emulators.
  • the developer can receive ( 108 ) results of the review by the system. If the application program is not qualified or approved, a message can be sent to the developer.
  • the message can include a statement that the application program did not pass the review process, a list of one or more errors that occurred, and an explanation for each error.
  • the developer can redesign, reimplement, and retest the application program for submission again.
  • the application program can be distributed ( 120 ).
  • Distributing the application program can include storing the application program in a data store and providing the application program for download by other users (e.g., the general public).
  • FIG. 2 illustrates components of an exemplary system 200 implementing automated testing of a touch-input program.
  • the touch-input program includes a program that can accept a touch input, including an input from a touch input component (e.g., a touch-sensitive display, a touch pad, or a touch switch) or other motion input.
  • Exemplary system 200 can include testing device 202 and testing host 204 .
  • Testing device 202 can include a device on which application program 214 executes. Testing device 202 can include a variety of devices, both mobile and non-mobile. Mobile devices include, but are not limited to: a handheld computer, a personal digital assistant (PDA), a cellular telephone, an electronic tablet, a network appliance, a digital camera, a video camera, a smart phone, an enhanced general packet radio service (EGPRS) mobile phone, a media player, a navigation device, an email device, a game console, or a combination of any two or more of these devices. Testing device 202 can include one or more touch input components (e.g., a touch-sensitive screen, a touch pad, or a touch switch). Testing device 202 can alternatively or additionally include one or more motion or location sensing devices (e.g., an angular rate sensor, an accelerometer, a magnetometer, or a global positioning system (GPS) receiver).
  • Application program 214 can include various functionalities that utilize the touch input components and motion or location sensing devices of testing device 202 .
  • Application program 214 can include a function that is to be invoked upon receiving input from a touch input component, a motion input, or a combination of both.
  • application program 214 can cause testing device 202 to play a sound or speak a word or phrase when a user taps a specified display button, flips testing device 202 upside down, or waves testing device 202 in a specified pattern.
  • Testing host 204 can be configured to conduct testing of application program 214 in an automatic manner, without actually requiring a user to touch the touch input component of testing device 202 or flip or wave testing device 202 .
  • Testing host 204 can include one or more computing devices.
  • Testing device 202 and testing host 204 can connect to one another through a connection cable or connection dock, or through a wired or wireless communications network.
  • testing device can be an emulator (e.g., a virtual device) that executes on testing host 204 or a computing device connected to testing host 204 .
  • Testing host 204 can host test script 206 .
  • Test script 206 can generate one or more signals simulating the touch input.
  • the signals simulating the touch input can be fed to testing device 202 as inputs to application program 214 .
  • test script 206 is stored on testing device 202 and executes on testing device 202 .
  • Application program 214 can execute in a testing environment where signals simulating the touch inputs replace inputs from various input devices and sensors of testing device 202 . For example, if an item is configured to be currently displayed on a touch-sensitive display screen of testing device 202 , a function of test script 206 can include generating a signal to the testing environment simulating a touch on the displayed item.
  • Test script 206 can include a series of functions that can send the signals.
  • the functions can be invoked at specified or user-configurable time intervals.
  • the intervals (e.g., a number of seconds) can be chosen so that application program 214 has time to respond to one simulated touch input before the next is sent.
  • the response can include, for example, displaying a second item on the touch-sensitive display screen.
  • a second touch input can be a touch input on the second item.
  • the confirmation (that the application responded as expected) and the instruction (to perform the next touch input) can be provided using functions supported by an accessibility framework.
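  • As an illustration of the interval-driven flow described above (a minimal sketch only; target(), frontMostApp(), elements(), tap(), delay(), isValid(), and logResult() are hypothetical names, not functions defined in this disclosure), a test script might read:

        // Hypothetical JavaScript test script: simulate a touch, give the
        // application time to respond, then simulate a touch on the new item.
        var app = target().frontMostApp();
        var firstItem = app.mainWindow().elements()["Item 1"];
        firstItem.tap();                         // signal simulating the first touch input

        target().delay(2);                       // user-configurable interval, in seconds

        var secondItem = app.mainWindow().elements()["Item 2"];
        if (secondItem.isValid()) {
            secondItem.tap();                    // second touch input on the second item
            logResult("pass", "second item displayed and tapped");
        } else {
            logResult("fail", "second item was not displayed");
        }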
  • application program 214 makes use of an accessibility framework.
  • the accessibility framework defines a generic object (called an “accessibility object”) that can represent any type of user interface object, such as a window, a control, and even an application program itself.
  • Accessibility objects enable an application to provide information about its user interface and capabilities (e.g., available functions) in a standard manner.
  • accessibility objects provide a uniform representation of an application's user interface elements, regardless of the application framework on which the application depends.
  • the accessibility framework represents an application's user interface as a hierarchy of accessibility objects.
  • An accessibility object provides information about the user interface object that it represents. This information includes, for example, the object's position in the accessibility hierarchy, the object's position on the display, details about what the object is, and what actions the object can perform.
  • Accessibility objects can be communicated with via a particular application programming interface (the “Accessibility API”).
  • An accessibility object within an application can perform actions, such as actions that correspond to user input like a touch input.
  • Each accessibility object includes information about which actions it supports, if any. For example, the accessibility object representing a button supports the “press” action and sends a request to the actual button user interface element to perform the action.
  • An accessibility object has attributes associated with it.
  • the number and kind of attributes vary depending on the type of user interface object the accessibility object represents. Attributes have values that can be used to find out about the user interface object. For example, the value of an accessibility object's role attribute indicates what type of user interface object that accessibility object represents.
  • an action is an attribute of an accessibility object.
  • the accessibility protocol supports actions differently from the way it supports attributes, so actions and attributes are often described separately.
  • An accessibility object can support one or more actions.
  • An action describes how a user interacts with the user interface object that the accessibility object represents.
  • the accessibility framework defines seven actions that an accessibility object can support: press (a button), increment (the value of a scroller or slider indicator), decrement (the value of a scroller or slider indicator), confirm (the selection of an object), cancel (a selection or an action), raise (a window), and show menu (display a contextual menu associated with an object).
  • a message can be sent to an accessibility object.
  • a first type of message requests information from the accessibility object.
  • the accessibility object responds to this type of message by returning attribute values.
  • a second type of message requests the performance of an action.
  • the accessibility object responds to this type of message by performing the action.
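  • A brief sketch of the two message types, using hypothetical JavaScript bindings (accessibilityElement(), attribute(), and performAction() are assumed names rather than the Accessibility API's actual signatures):

        // Hypothetical bindings over the Accessibility API described above.
        var buttonObject = accessibilityElement("Configure Network");

        // First type of message: request information; the object returns attribute values.
        var role    = buttonObject.attribute("role");      // e.g., "button"
        var label   = buttonObject.attribute("label");     // e.g., "Configure Network"
        var actions = buttonObject.attribute("actions");   // e.g., ["press"]

        // Second type of message: request the performance of an action.
        if (actions.indexOf("press") !== -1) {
            buttonObject.performAction("press");           // the button responds as if pressed
        }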
  • the various user interface elements and capabilities of application program 214 are represented by accessibility objects.
  • Test script 206 interacts with the user interface of application program 214 via the Accessibility API.
  • the Accessibility API enables communication with the accessibility objects that represent the user interface elements of application program 214 .
  • test script 206 can generate one or more signals simulating a touch input. These signals are sent as messages, via the Accessibility API, to accessibility objects that represent user interface elements of application program 214 . The messages cause the accessibility objects to perform various actions, thereby sending requests to the actual user interface elements to perform the actions and simulating the user input. Test script 206 can also use messages to obtain information about the state of the user interface of application program 214 .
  • the testing device may also support more general APIs for simulating touch events at a system level. In contrast, a button's accessibility object might accept a "press" action, which tells the button to respond as if it were pressed. That approach sends the action directly to the desired user interface object, but bypasses any OS and application components devoted to routing real user events.
  • the testing mechanism described here can invoke either the system-level input simulation APIs or the Accessibility action APIs as needed, to best simulate the actions specified in the test script.
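  • A sketch of how a test harness might choose between the two routes (systemTap(), rect(), and performAction() are hypothetical names used only for illustration):

        // Simulate a press either through a system-level touch event or
        // through the accessibility object's "press" action.
        function simulatePress(element, useSystemEvents) {
            if (useSystemEvents) {
                // System-level route: exercises the OS and application components
                // that route real user events (hit-testing, event delivery).
                var r = element.rect();
                systemTap({ x: r.origin.x + r.size.width / 2,
                            y: r.origin.y + r.size.height / 2 });
            } else {
                // Accessibility route: sends the action directly to the
                // desired user interface object, bypassing event routing.
                element.performAction("press");
            }
        }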
  • Testing host 204 can host event recorder 208 .
  • Event recorder 208 can be configured to receive various outputs of test script 206 .
  • the outputs of test script 206 can be submitted to the recorder 208 .
  • the recorder can record events submitted from testing device 202 .
  • the events can include content of the outputs (e.g., a screen shot, an audio recording, or a notification of a vibration) and metadata.
  • the metadata can include, for example, a time that the outputs occurred, a memory status (e.g., a call stack) at the time the outputs occurred, variable values of application program 214 , and other information related to a status of application program 214 and testing device 202 .
  • Events recorded by event recorder 208 can be stored in event data store 210 .
  • Event analyzer 212 can retrieve data stored in event data store 210 and replay the events.
  • the replay can include presenting various events in a user interface, including screen shots, audio outputs, and physical action of testing device 202 .
  • the physical action can be represented using an audio or visual representation.
  • the replay can include displaying various memory states along a timeline.
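  • A sketch of what a single recorded event might look like in event data store 210 (the field names are illustrative assumptions, not a schema defined by this disclosure):

        // One event: the output content plus metadata about when it occurred
        // and the state of application program 214 and testing device 202.
        var recordedEvent = {
            content: { kind: "screenshot", file: "step_004.png" },
            metadata: {
                timestamp: "2011-06-21T10:15:32Z",    // time the output occurred
                callStack: ["main", "runTests", "tapConfigureNetwork"],
                variables: { networkConfigured: false },
                deviceStatus: { orientation: "portrait", freeMemoryKB: 51200 }
            }
        };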
  • FIGS. 3A-3D illustrate exemplary techniques of testing various touch screen input programs. Similar techniques apply to testing programs that use different touch input components, such as a touch pad.
  • FIG. 3A illustrates exemplary techniques of testing an application program accepting touch input from a user interface element that supports accessibility functions. Accessibility functions can include functions that provide assistance (e.g., enlarged fonts and/or voice readouts) to people with disabilities (e.g., the visually impaired) for accessing the application program.
  • the application program “My App” can execute on mobile device 300 a.
  • a graphical user interface of the application program can be displayed on touch-sensitive screen 302 a of mobile device 300 a.
  • the graphical user interface can include window 304 .
  • Window 304 can include virtual button 306 “Configure Network” that accepts a touch input (e.g., a tap).
  • Virtual button 306 can be associated with accessibility features.
  • test script 206 can be utilized to test the functions of virtual button 306 and accessibility features associated with virtual button 306 .
  • the test script can use one or more of the following exemplary functions:
  • the exemplary functions can be associated with various display elements (e.g., virtual button 306 ).
  • the display elements can be identified by one or more display element identifiers.
  • the test script can include a function that returns a main display window, use an element() function to retrieve one or more display elements of the main display window, and identify virtual button 306 from the retrieved elements by name, identifier, or class membership.
  • the test script can be utilized to test accessibility functions of virtual button 306 .
  • virtual button 306 can be configured to invoke dialog balloon 308 upon receiving a single touch.
  • Dialog balloon 308 can include enlarged text of virtual button 306 as an aid to visually impaired users.
  • virtual button 306 can be configured to invoke a voice over that includes speech (e.g., synthesized or pre-recorded voice) or a Braille output describing the virtual button 306 upon receiving the single touch.
  • the test script can include functions for capturing dialog balloon 308 (e.g., through a screenshot), the voice over (e.g., through recording), or the Braille output for testing the accessibility functions.
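  • The exemplary functions themselves are not reproduced above; the following sketch assumes names in the same spirit (target(), mainWindow(), element(), tap(), captureScreenWithName(), lastSpokenOutput(), and logResult() are all hypothetical):

        // Test virtual button 306 ("Configure Network") and its accessibility features.
        var window = target().frontMostApp().mainWindow();
        var button = window.element({ name: "Configure Network" });

        captureScreenWithName("before_tap");
        button.tap();                                // simulated single touch
        captureScreenWithName("after_tap");          // should show dialog balloon 308

        // The voice over should read out the button's accessibility label.
        var spoken = lastSpokenOutput();             // e.g., from an audio recording
        logResult(spoken === button.attribute("label") ? "pass" : "fail",
                  "voice over matches the accessibility label");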
  • FIG. 3B illustrates exemplary techniques of testing an application program accepting touch screen inputs that includes a drag-and-drop.
  • the application program “My App” can execute on mobile device 300 b.
  • the application program is configured to display a graphical user interface in which a user can drag-and-drop a display element 310 by tapping (and holding) on display element 310 using a touching means (finger or stylus); while maintaining contact with display element 310 on a display screen 302 b, moving the touching means into a display area 314 ; and releasing display element 310 .
  • display element 310 can be displayed in display area 314 .
  • a test script (e.g., test script 206 ) can be utilized to test the drag-and-drop functions.
  • the test script can use one or more of the following exemplary functions: dragInsideWithOptions(options), dragInsideWithOptions(object, options), or dragInsideWithOptions(location, options).
  • the dragInsideWithOptions function can be used to generate one or more signals simulating the drag-and-drop input.
  • the options can include a dictionary that specifies characteristics of a gesture (e.g., a movement of a touching means on a touch-sensitive display screen).
  • the characteristics can include one or more of the following:
  • the test script can use offsets to achieve finer precision in specifying the hitpoint within a rectangle (rect) for the specified display element.
  • the offset can include a pair of values x and y, each ranging from 0.0 to 1.0.
  • the x and y values can represent, respectively, relative horizontal and vertical positions within the rectangle, with ⁇ x:0.0, y:0.0 ⁇ as the top left and ⁇ x:1.0, y:1.0 ⁇ as the bottom right.
  • ⁇ x:0.3, y:0.6 ⁇ can specify a position just below and to the left of a center; and ⁇ x:1.0, y:0.5 ⁇ can specify a position centered vertically at the far right.
  • the test script can use one or more of the following exemplary functions:
  • the fromPointObject parameter can specify a rectangle or point from which the drag action is to begin.
  • the toPointObject parameter can specify a rectangle or point at which the drag action is to end.
  • the duration parameter can specify a length of time, in seconds, between starting and stopping of the drag-and-drop action.
  • the test script can use the function to simulate the drag-and-drop input that drags display element 310 and drops display element 310 into display area 314 .
  • a graphical user interface before the drag-and-drop and a graphical user interface after the drag-and-drop can be recorded.
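  • A sketch of the drag-and-drop test (the element names are illustrative; dragFromToForDuration is an assumed name for the function whose fromPointObject, toPointObject, and duration parameters are described above):

        var window  = target().frontMostApp().mainWindow();
        var element = window.element({ name: "element_310" });
        var area    = window.element({ name: "area_314" });

        // Variant 1: drag within the element, described by an options dictionary.
        element.dragInsideWithOptions({
            touchCount:  1,
            startOffset: { x: 0.5, y: 0.5 },         // start at the center of the rect
            endOffset:   { x: 1.0, y: 0.5 },         // end centered vertically at the far right
            duration:    1.5                         // seconds between start and stop
        });

        // Variant 2: drag from one rectangle to another over two seconds.
        target().dragFromToForDuration(element.rect(), area.rect(), 2.0);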
  • FIG. 3C illustrates exemplary techniques of testing an application program accepting touch screen inputs that includes pinching and spreading.
  • the application program “My App” can execute on mobile device 300 c.
  • the application program is configured to display on touch-sensitive screen 302 c a graphical user interface that can accept a pinching input.
  • the pinching input can include a touch screen input where two or more touching means (fingers or styluses) are moving away from each other (pinch-open) or towards each other (pinch-close).
  • the pinch-open and pinch-close inputs can be configured to cause a display element to resize or zoom.
  • display element 326 can be resized to display element 324 upon receiving a pinch-open input.
  • display element 324 can be resized to display element 326 upon receiving a pinch-close input.
  • Other functions (e.g., zoom out and zoom in) can also be associated with the pinch-open and pinch-close inputs.
  • test script 206 can be utilized to test the pinching functions.
  • test script 206 can use one or more of the following exemplary functions:
  • a screenshot of the user interface before and after simulated signals of the pinch-open and pinch-close inputs can be recorded.
  • the screenshot can be analyzed to determine, for example, whether display element 324 or 326 has resized correctly.
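  • The pinch functions are not reproduced above; a sketch under assumed names (pinchOpenFromToForDuration and captureScreenWithName are hypothetical) might read:

        var window = target().frontMostApp().mainWindow();
        var rect   = window.element({ name: "element_326" }).rect();

        captureScreenWithName("before_pinch_open");
        // Two simulated touches moving away from each other for 1.5 seconds.
        target().pinchOpenFromToForDuration(
            { x: rect.origin.x + 40,  y: rect.origin.y + 40 },
            { x: rect.origin.x + 200, y: rect.origin.y + 200 },
            1.5);
        captureScreenWithName("after_pinch_open");   // expect the element resized to 324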
  • FIG. 3D illustrates exemplary techniques of testing an application program accepting a flick input.
  • the application program “My App” can execute on mobile device 300 d.
  • the application program is configured to display on touch-sensitive screen 302 d a graphical user interface that can accept a flick input.
  • the flick input can include a touch screen input that includes a quick movement of a touching means (finger or stylus) in a direction.
  • the flick input can be configured to cause a display element to move and reveal a next display element.
  • display element 332 (e.g., a first virtual page) can move to the left, right, up, or down to reveal a next display element.
  • a menu item can be voiced over.
  • a flick input can cause a next menu item to be voiced over.
  • test script 206 can be utilized to test the flicking functions.
  • test script 206 can use one or more of the following exemplary functions:
  • the options can include a touchCount, a startOffset, and an endOffset, as described above, and other parameters.
  • the functions can generate one or more signals simulating the flick input.
  • a screenshot of the user interface before the signals simulating the flick input and a screenshot of the user interface after the signals simulating the flick input can be taken.
  • the screenshots can be analyzed to determine, for example, whether display element 332 has moved in accordance with a specified manner (e.g., according to design).
  • a voice recording can be made if the simulated signals of the flick input cause a new menu item to be voiced over.
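  • A sketch of the flick test (flickInsideWithOptions is an assumed name for a function taking the options described above; captureScreenWithName is hypothetical):

        var page = target().frontMostApp().mainWindow()
                           .element({ name: "element_332" });

        captureScreenWithName("before_flick");
        page.flickInsideWithOptions({
            touchCount:  1,
            startOffset: { x: 0.9, y: 0.5 },         // begin near the right edge
            endOffset:   { x: 0.1, y: 0.5 }          // quick movement to the left
        });
        captureScreenWithName("after_flick");        // expect the next virtual page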
  • An application program accepting a rotation gesture input can be tested.
  • the application program can execute on a mobile device.
  • the application program is configured to display on a touch-sensitive screen a graphical user interface that can accept a rotation gesture input (a type of touch-and-hold gesture).
  • the rotation gesture input can include a touch screen input that includes a rotating movement of one or more touching means (fingers or styluses) in a clockwise or counterclockwise direction (e.g., rotating a finger while it is pressing on the touch screen).
  • the rotation gesture input can be configured to cause a display element to rotate.
  • the rotation gesture input can be configured to cause a dial-like display element to turn to a new value.
  • test script 206 can be utilized to test the rotation gesture functions.
  • test script 206 can use one or more of the following exemplary functions:
  • the options parameter can include a dictionary that specifies characteristics of a rotation gesture (e.g., a movement of one or more touching means on a touch-sensitive display screen).
  • the characteristics can include one or more of the following:
  • the location parameter can include a point object at the center of the rotation gesture, with properties for x and y.
  • the relevant coordinates are screen-relative and are adjusted to account for device orientation.
  • a screenshot of the user interface before the signals simulating the rotate gesture input and a screenshot of the user interface after the signals simulating the rotate gesture input can be taken.
  • the screenshots can be analyzed to determine, for example, whether the display element has moved in accordance with a specified manner (e.g., according to design).
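  • A sketch of the rotation-gesture test (rotateWithOptions is an assumed name; the options other than the center point, and captureScreenWithName, are assumptions as well):

        var screenRect = target().frontMostApp().mainWindow().rect();
        var center     = { x: screenRect.size.width / 2, y: screenRect.size.height / 2 };

        captureScreenWithName("before_rotate");
        target().rotateWithOptions(center, {         // screen-relative coordinates
            duration:   1.0,                         // seconds
            radius:     60,                          // distance of the touches from the center
            rotation:   Math.PI / 2,                 // rotate a quarter turn
            touchCount: 2
        });
        captureScreenWithName("after_rotate");       // expect the dial at its new value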
  • FIGS. 4A and 4B illustrate exemplary techniques of testing programs accepting inputs other than touch screen input.
  • FIG. 4A illustrates exemplary techniques of testing an application program accepting a shake input.
  • the application program “My App” can execute on mobile device 400 a.
  • the application program can be configured to provide a response to a motion input (e.g., a shake) of mobile device 400 a.
  • the response can include, for example, a voice over of all input items displayed on a display screen, opening or closing a particular user interface, or turning on or turning off mobile device 400 a.
  • a test script (e.g., test script 206 ) can be utilized to test the application program in response to a shake input.
  • the test script can use a shake function, which can provide a signal simulating a shake event of mobile device 400 a.
  • the shake function can have parameters that can be used to specify a direction, magnitude, and duration of the shake. Screenshots of user interfaces before and after the simulated signals of the shake input can be recorded. A voice recording can be made if the simulated signals of the shake input cause a voice over.
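  • A sketch of the shake test (the parameters passed to shake() illustrate the direction, magnitude, and duration options; captureScreenWithName and recordAudioForDuration are hypothetical helpers):

        captureScreenWithName("before_shake");
        target().shake({ direction: "horizontal", magnitude: 0.8, duration: 1.0 });
        recordAudioForDuration(3);                   // capture any voice over that is triggered
        captureScreenWithName("after_shake");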
  • FIG. 4B illustrates exemplary techniques of testing an application program accepting a change of orientation input.
  • the application program “My App” executes on mobile device 400 b and can be configured to provide a response to a change of orientation of mobile device 400 b.
  • An orientation of mobile device 400 b can include, for example, a portrait mode (e.g., upside up or upside down), a landscape mode (e.g., landscape left or landscape right), and a face mode (e.g., face up or face down).
  • the orientation can additionally or alternatively include an orientation of mobile device 400 b in a global reference frame.
  • the orientation can include a pointing direction or heading of an axis of the mobile device (e.g., “north west” or “bearing 315 degrees”).
  • the change of orientation can be detected using an accelerometer, an angular rate sensor, a motion sensor, a magnetometer, or a combination of the sensors.
  • the application program “My App” executing on mobile device 400 b can alternatively or additionally be configured to provide a response to a change of location of mobile device 400 b.
  • the change of location can be detected using a GPS device, a wireless triangulation device, a baseband processor, a magnetometer, or a combination of the sensors.
  • a test script (e.g., test script 206) can be utilized to test the behavior of the application program in response to a change of orientation or change of location.
  • the test script can use one or more of the following exemplary functions:
  • the functions can generate one or more signals simulating a change in orientation or location. Screenshots of user interfaces before and after the simulated signals of the change of orientation or change of location can be recorded. A recording can be made if the simulated signals of the change of orientation or change of location input cause a voice over. A change of user interfaces (e.g., a change of a map display or a change of a compass dial display) can be compared against the simulated change of orientation or change of location to determine whether the application program “My App” behaves correctly.
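  • A sketch of orientation and location tests (setDeviceOrientation and setLocation are assumed names for the exemplary functions referred to above; the orientation constant and coordinates are illustrative):

        captureScreenWithName("portrait");
        target().setDeviceOrientation("landscape-left");
        captureScreenWithName("landscape");          // compare against the expected layout

        target().setLocation({ latitude: 37.33, longitude: -122.03 });
        captureScreenWithName("after_location");     // e.g., a map or compass dial should update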
  • FIG. 5 illustrates an exemplary user interface 500 of testing a touch-input program.
  • Exemplary user interface 500 can be displayed on a display device coupled to a testing host (e.g., testing host 204 of FIG. 2 ) or a testing device (e.g., testing device 202 of FIG. 2 ).
  • Instruments section 502 can present for display various testing projects (e.g., groups of scripts, application programs, and associated devices). In the example given, a test of a touch-input program to be executed on example device “Device 1 ” is being conducted.
  • User interface 500 can include script control section 504 .
  • Script control section 504 can include control elements that enable a user to select one or more automated test scripts (e.g., test script 206 of FIG. 2 ).
  • the test scripts can be written using various programming languages (e.g., JavaScript).
  • Script control section 504 can include controls for opening one or more editing windows of the test scripts.
  • User interface 500 can include screenshot control section 506 .
  • Screenshot control section 506 can include controls that can configure a time of taking screenshots. The screenshots can be taken and recorded upon request or at predetermined screenshot time intervals or at screenshot points specified in the test script.
  • Time control sections 508 a and 508 b can include controls for configuring an inspection time range for the testing.
  • the inspection time range can specify a beginning time and an end time for inspecting behavior of the application program being tested.
  • the beginning time and end time can be offsets from the beginning of the application program execution.
  • User interface 500 can include script log section 510 .
  • Script log section 510 can include a set of events triggered by the test script. The set of events can be organized by time. A selection of a particular event can cause data related to the event to be displayed in event display section 512 .
  • Event display section 512 can display one or more events selected by the user.
  • the event can be displayed in association with event data.
  • the event data can include stack data, variable data (e.g., values of variables at time of the event), and memory status.
  • the event data can include one or more screenshots (e.g., screenshot 514 ) of the event.
  • the screenshot can include a display element (e.g., display element 324 of FIG. 3C ) before a pinch-close input is applied to the display element.
  • the event data can include an audio clip (e.g., a voice over) associated with an event.
  • an application program that is configured to receive a touch input can be tested in an automated manner.
  • a testing mechanism queries the application program's user interface (e.g., by sending messages to accessibility objects using the Accessibility API) to identify a touch input that is to produce a specified result.
  • the testing mechanism then generates one or more signals simulating the touch input. These signals are input to the application program (e.g., by sending messages via the Accessibility API to various accessibility objects).
  • the application program responds accordingly, as if a user had actually performed the touch input. For example, the application program modifies its user interface based on the (simulated) touch input.
  • the testing mechanism queries the application program's user interface again, to determine whether the user interface conforms to the specified result.
  • FIG. 6 is a flowchart illustrating exemplary process 600 of automatically testing a touch-input application program.
  • process 600 will be described in reference to a system implementing process 600 .
  • the system can include a device that executes the touch-input program, or a computer connected to the device, or both.
  • the system can interrogate ( 602 ) the user interface of the application program to identify a touch input operable for producing a specified result in the application program.
  • the application program can include a user interface element that accepts a touch input.
  • the touch input includes a gesture of one or more touching means (fingers or styluses) on a display screen of the device.
  • the user interface element can include one or more accessibility components (e.g., accessibility objects) that describe one or more accessibility attributes of the user interface element. The values of these attributes provide information about the user interface object and can be queried by the system.
  • the system can generate ( 604 ) one or more signals simulating the touch input.
  • the one or more signals simulating the touch input can be provided at least in part based on the accessibility attribute described in the accessibility component of the user interface element.
  • the signals can correspond to the tap or flick that is to elicit the specified result.
  • the system interrogates ( 606 ) the user interface again, checking for the expected changes.
  • the system can also receive events logged by the script (e.g., failure to simulate the touch input because the specified user interface element does not exist or an explicit script verification of one or more facets of the current user interface state) (not shown).
  • An event can include an accessibility feature that corresponds to the accessibility attribute described in the accessibility component of the user interface element.
  • the system can determine ( 608 ) that the actual result conforms to the specified result. Determining that the actual result conforms to the specified result can include comparing screenshots of the actual result with design specifications of the touch-input program.
  • the system can provide a library of signals simulating various touch inputs.
  • a test script for testing the touch-input application program can include serial or parallel calls to the library to generate one or more signals simulating sequential or concurrent touch inputs.
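  • A sketch of serial and concurrent calls into such a library (simulateTap and runConcurrently are hypothetical names):

        // Serial: one simulated touch input after another.
        simulateTap("Configure Network");
        simulateTap("Save");

        // Parallel: two simulated touch inputs at once (e.g., a two-finger gesture).
        runConcurrently([
            function () { simulateTap("Left Pad");  },
            function () { simulateTap("Right Pad"); }
        ]);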
  • FIG. 7 is a flowchart illustrating an exemplary process of testing a touch-input program in an exemplary testing environment.
  • a testing host can receive ( 702 ) a first event.
  • the first event can include the test script logging a simulated touch input to an application program executing on a device.
  • the device can be a device connected to the testing host through a communications network (e.g., a local area network (LAN) or a wide area network (WAN)).
  • the first event can be generated from a test script.
  • the simulated touch input for the application program can be produced by the testing script, or by a testing tool kit in response to an execution of the testing script.
  • the simulated touch input can include a single touch input, a multi touch input, a gesture, or a physical movement of the device.
  • the simulated user input is associated with a timestamp.
  • the testing host can receive ( 704 ) a second event from the test script, verifying that the application program user interface is in an expected state following the touch input.
  • the event can include a screenshot of the application program.
  • the testing host can acquire ( 706 ) a status of the device.
  • the status can correspond to the response to the simulated touch input.
  • the status can include a call stack, a memory status, including memory failure status, or a memory leak status.
  • the testing host can provide ( 708 ) for display a monitoring interface.
  • Providing the monitoring interface for display can include displaying, based on their timestamps, visual representations of the first event, the second event, and other events in a timeline.
  • FIG. 8 is a block diagram illustrating exemplary device architecture 800 of a mobile device implementing the features and operations described in reference to FIGS. 1-7 .
  • the mobile device can be, for example, a handheld computer, a personal digital assistant, a cellular telephone, an electronic tablet, a network appliance, a camera, a smart phone, an enhanced general packet radio service (EGPRS) mobile phone, a network base station, a media player, a navigation device, an email device, a game console, or a combination of any two or more of these data processing devices or other data processing devices.
  • the mobile device can include memory interface 802 , one or more data processors, image processors and/or processors 804 , and peripherals interface 806 .
  • Memory interface 802 , one or more processors 804 and/or peripherals interface 806 can be separate components or can be integrated in one or more integrated circuits.
  • Processors 804 can include one or more application processors (APs) and one or more baseband processors (BPs). The application processors and baseband processors can be integrated on a single processing chip.
  • the various components in the mobile device for example, can be coupled by one or more communication buses or signal lines.
  • Sensors, devices, and subsystems can be coupled to peripherals interface 806 to facilitate multiple functionalities.
  • motion sensor 810 , light sensor 812 , and proximity sensor 814 can be coupled to peripherals interface 806 to facilitate orientation, lighting, and proximity functions of the mobile device.
  • Location processor 815 (e.g., a GPS receiver) can be connected to peripherals interface 806 to provide geo-positioning data.
  • Electronic magnetometer 816 (e.g., an integrated circuit chip) can also be connected to peripherals interface 806 to provide data that can be used to determine the direction of magnetic north.
  • Accelerometer 817 can also be connected to peripherals interface 806 to provide data that can be used to determine change of speed and direction of movement of the mobile device.
  • An angular rate sensor (e.g., a Micro-Electro-Mechanical System (MEMS) gyro) can also be connected to peripherals interface 806 to provide data that can be used to determine a rate of rotation of the mobile device.
  • Camera subsystem 820 and an optical sensor 822 (e.g., a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical sensor) can be utilized to facilitate camera functions, such as recording photographs and video clips.
  • Communication functions can be facilitated through one or more wireless communication subsystems 824 , which can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters.
  • the specific design and implementation of the communication subsystem 824 can depend on the communication network(s) over which a mobile device is intended to operate.
  • a mobile device can include communication subsystems 824 designed to operate over a GSM network, a GPRS network, an EDGE network, a Wi-Fi or WiMax network, and a Bluetooth network.
  • the wireless communication subsystems 824 can include hosting protocols such that the mobile device can be configured as a base station for other wireless devices.
  • Audio subsystem 826 can be coupled to a speaker 828 and a microphone 830 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions.
  • I/O subsystem 840 can include touch screen controller 842 and/or other input controller(s) 844 .
  • Touch-screen controller 842 can be coupled to a touch screen 846 or pad.
  • Touch screen 846 and touch screen controller 842 can, for example, detect contact and movement or break thereof using any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 846 .
  • Other input controller(s) 844 can be coupled to other input/control devices 848 , such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus.
  • the one or more buttons can include an up/down button for volume control of speaker 828 and/or microphone 830 .
  • a pressing of the button for a first duration may disengage a lock of the touch screen 846 ; and a pressing of the button for a second duration that is longer than the first duration may turn power to the mobile device on or off.
  • the user may be able to customize a functionality of one or more of the buttons.
  • the touch screen 846 can, for example, also be used to implement virtual or soft buttons and/or a keyboard.
  • the mobile device can present recorded audio and/or video files, such as MP3, AAC, and MPEG files.
  • the mobile device can include the functionality of an MP3 player.
  • the mobile device may, therefore, include a pin connector that is compatible with the MP3 player.
  • Other input/output and control devices can also be used.
  • Memory interface 802 can be coupled to memory 850 .
  • Memory 850 can include high-speed random access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory (e.g., NAND, NOR).
  • Memory 850 can store operating system 852 , such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks.
  • Operating system 852 may include instructions for handling basic system services and for performing hardware dependent tasks.
  • operating system 852 can include a kernel (e.g., UNIX kernel).
  • Memory 850 may also store communication instructions 854 to facilitate communicating with one or more additional devices, one or more computers and/or one or more servers.
  • Memory 850 may include graphical user interface instructions 856 to facilitate graphic user interface processing; sensor processing instructions 858 to facilitate sensor-related processing and functions; phone instructions 860 to facilitate phone-related processes and functions; electronic messaging instructions 862 to facilitate electronic-messaging related processes and functions; web browsing instructions 864 to facilitate web browsing-related processes and functions; media processing instructions 866 to facilitate media processing-related processes and functions; GPS/Navigation instructions 868 to facilitate GPS and navigation-related processes and instructions; camera instructions 870 to facilitate camera-related processes and functions; magnetometer data 872 and calibration instructions 874 to facilitate magnetometer calibration.
  • the memory 850 may also store other software instructions (not shown), such as security instructions, web video instructions to facilitate web video-related processes and functions, and/or web shopping instructions to facilitate web shopping-related processes and functions.
  • the media processing instructions 866 are divided into audio processing instructions and video processing instructions to facilitate audio processing-related processes and functions and video processing-related processes and functions, respectively.
  • An activation record and International Mobile Equipment Identity (IMEI) or similar hardware identifier can also be stored in memory 850 .
  • Memory 850 can include testing instructions 876 that can perform one or more functions as described above.
  • Each of the above identified instructions and applications can correspond to a set of instructions for performing one or more functions described above. These instructions need not be implemented as separate software programs, procedures, or modules. Memory 850 can include additional instructions or fewer instructions. Furthermore, various functions of the mobile device may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.
  • FIG. 9 is a block diagram of an exemplary system architecture 900 for implementing the features and operations described in reference to FIGS. 1-8 .
  • architecture 900 includes one or more processors 902 (e.g., dual-core Intel® Xeon® Processors), one or more output devices 904 (e.g., LCD), one or more network interfaces 906 , one or more input devices 908 (e.g., mouse, keyboard, touch-sensitive display, touch pad, touch switch) and one or more computer-readable mediums 912 (e.g., RAM, ROM, SDRAM, hard disk, optical disk, flash memory, etc.).
  • These components can exchange communications and data over one or more communication channels 910 (e.g., buses), which can utilize various hardware and software for facilitating the transfer of data and control signals between components.
  • computer-readable medium refers to any medium that participates in providing instructions to processor 902 for execution, including without limitation, non-volatile media (e.g., optical or magnetic disks), volatile media (e.g., memory) and transmission media.
  • Transmission media includes, without limitation, coaxial cables, copper wire and fiber optics.
  • Computer-readable medium 912 can further include operating system 914 (e.g., Mac OS® server, Windows® NT server), network communication module 916 , database interface 920 , test script 930 , event recorder 940 , event data store 950 , and event analyzer 960 , as described in reference to FIGS. 1-8 .
  • Operating system 914 can be multi-user, multiprocessing, multitasking, multithreading, real time, etc. Operating system 914 performs basic tasks, including but not limited to: recognizing input from and providing output to devices 908 , 904 ; keeping track of and managing files and directories on computer-readable mediums 912 (e.g., memory or a storage device); controlling peripheral devices; and managing traffic on the one or more communication channels 910 .
  • Network communications module 916 includes various components for establishing and maintaining network connections (e.g., software for implementing communication protocols, such as TCP/IP, HTTP, etc.).
  • Database interface 920 can include an interface to various data stores, such as event data store 950.
  • Architecture 900 can be included in any device capable of hosting a database application program.
  • Architecture 900 can be implemented in a parallel processing or peer-to-peer infrastructure or on a single device with one or more processors.
  • Software can include multiple software components or can be a single body of code.
  • the described features can be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device.
  • a computer program is a set of instructions that can be used, directly or indirectly, in a computer to perform a certain activity or bring about a certain result.
  • a computer program can be written in any form of programming language (e.g., Objective-C, Java), including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
  • Suitable processors for the execution of a program of instructions include, by way of example, both general and special purpose microprocessors, and the sole processor or one of multiple processors or cores, of any kind of computer.
  • a processor will receive instructions and data from a read-only memory or a random access memory or both.
  • the essential elements of a computer are a processor for executing instructions and one or more memories for storing instructions and data.
  • a computer will also include, or be operatively coupled to communicate with, one or more mass storage devices for storing data files; such devices include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks.
  • Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • the processor and the memory can be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).
  • the features can be implemented on a computer having a display device such as a CRT (cathode ray tube) or LCD (liquid crystal display) monitor for displaying information to the user and a keyboard and a pointing device such as a mouse or a trackball by which the user can provide input to the computer.
  • the features can be implemented in a computer system that includes a back-end component, such as a data server, or that includes a middleware component, such as an application server or an Internet server, or that includes a front-end component, such as a client computer having a graphical user interface or an Internet browser, or any combination of them.
  • the components of the system can be connected by any form or medium of digital data communication such as a communication network. Examples of communication networks include, e.g., a LAN, a WAN, and the computers and networks forming the Internet.
  • the computer system can include clients and servers.
  • a client and server are generally remote from each other and typically interact through a network.
  • the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.

Abstract

Methods and systems are disclosed that allow automated testing of an application program that is configured to receive a touch input. A testing mechanism can be configured to identify the touch input that is designed to produce a specified result. The testing mechanism can generate one or more signals simulating the touch input. The testing mechanism can then check the state of the user interface of the application program being tested and determine whether the actual result conforms to the specified result.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims the benefit of U.S. Provisional Application No. 61/357,090, filed Jun. 21, 2010, which is hereby incorporated by reference herein in its entirety.
  • TECHNICAL FIELD
  • This subject matter is generally related to software development.
  • BACKGROUND
  • A software development process can include a structure for creating and maintaining a software product. A software development process can include multiple stages. Some exemplary software development stages can include design, implementation, testing, and distribution. Some models of software development processes in use today include the waterfall model, the spiral model, agile software development, and extreme programming (XP), among others.
  • Modern software development processes for various data processing systems allow for participation of a vast number of diverse developers for a platform. Tools for developing software for the platform can include a publicly available software development kit (SDK) and various rules and guidelines. The SDK can include various libraries and an integrated development environment (IDE). Using the SDK, a developer can develop an application program for the platform. The application program can be distributed to data processing systems that are compatible with the platform, for example, through an application store.
  • SUMMARY
  • Methods and systems are disclosed that allow automated testing of an application program that is configured to receive a touch input. A testing mechanism queries the application program's user interface to identify a touch input that is to produce a specified result. The testing mechanism then generates one or more signals simulating the touch input. These signals are input to the application program. The application program responds accordingly, as if a user had actually performed the touch input. For example, the application program modifies its user interface based on the (simulated) touch input. The testing mechanism then queries the application program's user interface again, to determine whether the user interface conforms to the specified result.
  • These and other embodiments can optionally include one or more of the following features. The application program can include a user interface element that accepts the touch input. The user interface element can include an accessibility component that describes an accessibility attribute of the user interface element. The one or more signals simulating the touch input can be provided at least in part based on the accessibility attribute described in the accessibility component of the user interface element. A result can include an accessibility feature that corresponds to the accessibility attribute described in the accessibility component of the user interface element. The accessibility feature employed is the same one that can be used to provide a voice readout of the application program's user interface.
  • The details of one or more embodiments of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the subject matter will become apparent from the description, the drawings, and the claims.
  • DESCRIPTION OF DRAWINGS
  • FIG. 1 is a flowchart illustrating an exemplary process of testing an application program for a device.
  • FIG. 2 illustrates components of an exemplary system implementing automated testing of a touch-input program.
  • FIGS. 3A-3D illustrate exemplary techniques of testing various touch screen input programs.
  • FIGS. 4A and 4B illustrate exemplary techniques of testing programs accepting inputs other than touch screen input.
  • FIG. 5 illustrates an exemplary user interface of testing a touch-input program.
  • FIG. 6 is a flowchart illustrating an exemplary process of automatically testing a touch-input program.
  • FIG. 7 is a flowchart illustrating an exemplary process of testing a touch-input program in an exemplary testing environment.
  • FIG. 8 is a block diagram illustrating an exemplary device architecture of a mobile device implementing the features and operations described in reference to FIGS. 1-7.
  • FIG. 9 is a block diagram of an exemplary system architecture for implementing the features and operations described in reference to FIGS. 1-8.
  • Like reference numbers and designations in the various drawings indicate like elements.
  • DETAILED DESCRIPTION Exemplary Software Development Process
  • FIG. 1 is a flowchart illustrating an exemplary process 100 of testing a program for a device. In exemplary process 100, a developer acquires (e.g., by downloading) an SDK of a platform, and develops an application program. The application program is to be distributed (e.g., provided for download) to other users of devices compatible with the platform. In this specification, any person who engages in any part of developing the application program can be a developer.
  • The developer can develop (102) an application program that accepts touch inputs. Developing the application program can include, for example, gathering requirements, designing the application program, writing source code of the application program, compiling the source code into binary code, and linking the binary code into executable code. The touch inputs can include inputs to a touch input component (e.g., a touch-sensitive display, a touch pad, or a touch switch). The touch screen input can include single-touch and multi-touch input. Additionally or alternatively, the touch inputs can include other motion inputs, e.g., shaking, rotating, or generally moving the device on which the application program executes.
  • The application program, or simply program, that is developed in process 100 can include any computer instructions that are configured to perform user tasks (e.g., tasks that produce results for a user) or system tasks (e.g., tasks that manage computing resources of a computer) or both. The application program can be an application program based on a specified platform. The platform can include libraries that provide functions and application programming interfaces (APIs) supporting the touch inputs. For example, an API can enable the developer to utilize various features of a touch input component, or various features of motion sensors, accelerometers, angular rate sensors, and magnetometers.
  • The developer can test (104) the application program, for example, by executing and debugging the executable code. The testing can be accomplished using one or more test scripts. The test scripts can include one or more instructions written in a software language that test a part or all of the functionalities of the application program and determine whether the application program meets requirements guiding the design and development of the application program, and performs reliably and cleanly (e.g., without causing memory leaks). In particular, the test scripts can test whether functionalities that accept touch inputs work correctly with corresponding hardware (e.g., a touch input component) of the device. The testing can be conducted on the device, on a host computing device to which the device is tethered or wirelessly connected, or on a computer that executes an emulator of the device.
  • Once the developer is satisfied with the testing, the developer can submit (106) the application program for review by a system or by a system developer (e.g., a developer responsible for the integrity of the platform). Submitting the application program for review can include uploading the linked binary executable code of the application program and the test script to a server for automatic or manual review. Submitting the application program for review can include submitting one or more test scripts. During the review, a system can automatically execute the submitted test script and test functionality and performance of the submitted application program on one or more types of devices or device emulators.
  • The developer can receive (108) results of the review by the system. If the application program is not qualified or approved, a message can be sent to the developer. The message can include a statement that the application program did not pass the review process, a list of one or more errors that occurred, and an explanation for each error. The developer can redesign, reimplement, and retest the application program for submission again.
  • Upon qualification and approval from the review, the application program can be distributed (120). Distributing the application program can include storing the application program in a data store and providing the application program for download by other users (e.g., the general public).
  • Exemplary System for Testing a Touch-Input Program
  • FIG. 2 illustrates components of an exemplary system 200 implementing automated testing of a touch-input program. The touch-input program includes a program that can accept a touch input, including an input from a touch input component (e.g., a touch-sensitive display, a touch pad, or a touch switch) or other motion input. Exemplary system 200 can include testing device 202 and testing host 204.
  • Testing device 202 can include a device on which application program 214 executes. Testing device 202 can include a variety of devices, both mobile and non-mobile. Mobile devices include, but are not limited to: a handheld computer, a personal digital assistant (PDA), a cellular telephone, an electronic tablet, a network appliance, a digital camera, a video camera, a smart phone, an enhanced general packet radio service (EGPRS) mobile phone, a media player, a navigation device, an email device, a game console, or a combination of any two or more of these devices. Testing device 202 can include one or more touch input components (e.g., a touch-sensitive screen, a touch pad, or a touch switch). Testing device 202 can alternatively or additionally include one or more motion or location sensing devices (e.g., an angular rate sensor, an accelerometer, a magnetometer, or a global positioning system (GPS) receiver).
  • Application program 214 can include various functionalities that utilize the touch input components and motion or location sensing devices of testing device 202. Application program 214 can include a function that is to be invoked upon receiving input from a touch input component, a motion input, or a combination of both. For example, application program 214 can cause testing device 202 to play a sound or speak a word or phrase when a user taps a specified display button, flips testing device 202 upside down, or waves testing device 202 in a specified pattern.
  • Testing host 204 can be configured to conduct testing of application program 214 in an automatic manner, without actually requiring a user to touch the touch input component of testing device 202 or flip or wave testing device 202. Testing host 204 can include one or more computing devices. Testing device 202 and testing host 204 can connect to one another through a connection cable or connection dock, or through a wired or wireless communications network. In some implementations, testing device can be an emulator (e.g., a virtual device) that executes on testing host 204 or a computing device connected to testing host 204.
  • Testing host 204 can host test script 206. Test script 206 can generate one or more signals simulating the touch input. The signals simulating the touch input can be fed to testing device 202 as inputs to application program 214. (In another embodiment, not shown, test script 206 is stored on testing device 202 and executes on testing device 202.) Application program 214 can execute in a testing environment where signals simulating the touch inputs replace inputs from various input devices and sensors of testing device 202. For example, if an item is configured to be currently displayed on a touch-sensitive display screen of testing device 202, a function of test script 206 can include generating a signal to the testing environment simulating a touch on the displayed item. Test script 206 can include a series of functions that can send the signals. The functions can be invoked at specified or user-configurable time intervals. The intervals (e.g., number of seconds) can be configured to allow application program 214 to have sufficient time to generate a response to a first touch input. The response can include, for example, displaying a second item on the touch-sensitive display screen. A second touch input can be a touch input on the second item. Confirmation of the response and the instruction for the second touch input can be provided using functions supported by an accessibility framework.
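  • The pacing described above might appear in a test script as follows. This is a minimal sketch only; it assumes the testing environment exposes a tap (object) function of the kind described in reference to FIG. 3A, together with hypothetical findElement (name) and delay (seconds) helpers that are not defined by this specification:

        // Simulate a first touch input and give application program 214 time to respond.
        var firstItem = findElement("Configure Network");   // hypothetical lookup of a display element by name
        tap(firstItem);                                      // signal simulating a single tap
        delay(2);                                            // wait a configurable interval for the response
        // The response is expected to display a second item; the second touch input targets it.
        var secondItem = findElement("Second Item");         // hypothetical name of the newly displayed item
        tap(secondItem);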
  • In one embodiment, application program 214 makes use of an accessibility framework. The accessibility framework defines a generic object (called an “accessibility object”) that can represent any type of user interface object, such as a window, a control, and even an application program itself. Accessibility objects enable an application to provide information about its user interface and capabilities (e.g., available functions) in a standard manner. Specifically, accessibility objects provide a uniform representation of an application's user interface elements, regardless of the application framework on which the application depends.
  • The accessibility framework represents an application's user interface as a hierarchy of accessibility objects. An accessibility object provides information about the user interface object that it represents. This information includes, for example, the object's position in the accessibility hierarchy, the object's position on the display, details about what the object is, and what actions the object can perform.
  • Accessibility objects can be communicated with via a particular application programming interface (the “Accessibility API”). An accessibility object within an application can perform actions, such as actions that correspond to user input like a touch input. Each accessibility object includes information about which actions it supports, if any. For example, the accessibility object representing a button supports the “press” action and sends a request to the actual button user interface element to perform the action.
  • An accessibility object has attributes associated with it. The number and kind of attributes vary depending on the type of user interface object the accessibility object represents. Attributes have values that can be used to find out about the user interface object. For example, the value of an accessibility object's role attribute indicates what type of user interface object that accessibility object represents.
  • Technically, an action is an attribute of an accessibility object. However, the accessibility protocol supports actions differently from the way it supports attributes, so actions and attributes are often described separately. An accessibility object can support one or more actions. An action describes how a user interacts with the user interface object that the accessibility object represents. In one embodiment, the accessibility framework defines seven actions that an accessibility object can support: press (a button), increment (the value of a scroller or slider indicator), decrement (the value of a scroller or slider indicator), confirm (the selection of an object), cancel (a selection or an action), raise (a window), and show menu (display a contextual menu associated with an object).
  • A message can be sent to an accessibility object. A first type of message requests information from the accessibility object. The accessibility object responds to this type of message by returning attribute values. A second type of message requests the performance of an action. The accessibility object responds to this type of message by performing the action.
  • In one embodiment, the various user interface elements and capabilities of application program 214 are represented by accessibility objects. Test script 206 interacts with the user interface of application program 214 via the Accessibility API. The Accessibility API enables communication with the accessibility objects that represent the user interface elements of application program 214.
  • Recall that test script 206 can generate one or more signals simulating a touch input. These signals are sent as messages, via the Accessibility API, to accessibility objects that represent user interface elements of application program 214. The messages cause the accessibility objects to perform various actions, thereby sending requests to the actual user interface elements to perform the actions and simulating the user input. Test script 206 can also use messages to obtain information about the state of the user interface of application program 214.
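  • The two message types might be exercised from a test script as follows. This is a minimal sketch, assuming a hypothetical accessibilityElement (name) lookup that returns a script object wrapping the accessibility object for a named user interface element, with hypothetical attribute and performAction methods; the attribute and action names follow the conventions described above:

        var button = accessibilityElement("Configure Network");  // hypothetical lookup of an accessibility object
        // First message type: request information; the object responds with attribute values.
        var role = button.attribute("role");                     // indicates the type of user interface object
        var position = button.attribute("position");             // the object's position on the display
        // Second message type: request the performance of an action; the object responds by performing it.
        button.performAction("press");                           // the button responds as if it were pressed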
  • One embodiment of an accessibility framework is described in “Chapter 3: The Mac OS X Accessibility Protocol” within “Accessibility Overview”, by Apple Inc.
  • Apart from the actions defined in the Accessibility API that are directed to a particular accessibility object, the testing device may support more general APIs for simulating touch events at a system level. For example, a button's accessibility object might accept a “press” action, which tells the button to respond as if it were pressed. This approach sends the action directly to the desired user interface object, but bypasses any OS and application components devoted to routing real user events. The system-level input simulation APIs, on the other hand, allow for more user-event-like actions such as: tap at screen location {x=150, y=320}, without regard to what is at that screen location. These events enter the application being tested in essentially the same way that real user events do, providing testing of those same event-routing code paths. The testing mechanism described here can invoke either the system-level input simulation APIs or the Accessibility action APIs as needed, to best simulate the actions specified in the test script.
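  • The two routes might be compared in a test script as follows; this is a minimal sketch, reusing the hypothetical accessibilityElement helper from above and the tap function described in reference to FIG. 3A:

        // Element-directed route: send the action to the accessibility object, bypassing event routing.
        accessibilityElement("Configure Network").performAction("press");
        // System-level route: simulate a tap at a screen location, exercising the event-routing code paths.
        tap({x: 150, y: 320});                                   // tap at screen location {x=150, y=320}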
  • Testing host 204 can host event recorder 208. Event recorder 208 can be configured to receive various outputs of test script 206. The outputs of test script 206 can be submitted to the recorder 208. The recorder can record events submitted from testing device 202. The events can include content of the outputs (e.g., a screen shot, an audio recording, or a notification of a vibration) and metadata. The metadata can include, for example, a time that the outputs occurred, a memory status (e.g., a call stack) at the time the outputs occurred, variable values of application program 214, and other information related to a status of application program 214 and testing device 202.
  • Events recorded by event recorder 208 can be stored in event data store 210. Event analyzer 212 can retrieve data stored in event data store 210 and replay the events. The replay can include presenting various events in a user interface, including screen shots, audio outputs, and physical action of testing device 202. The physical action can be represented using an audio or visual representation. The replay can include displaying various memory states along a timeline.
  • FIGS. 3A-3D illustrate exemplary techniques of testing various touch screen input programs. Similar techniques apply to testing programs that use different touch input components, such as a touch pad. FIG. 3A illustrates exemplary techniques of testing an application program accepting touch input from a user interface element that supports accessibility functions. Accessibility functions can include functions that provide assistance (e.g., enlarged fonts and/or voice readouts) to people with disabilities (e.g., the visually impaired) for accessing the application program.
  • The application program “My App” can execute on mobile device 300 a. A graphical user interface of the application program can be displayed on touch-sensitive screen 302 a of mobile device 300 a. The graphical user interface can include window 304. Window 304 can include virtual button 306 “Configure Network” that accepts a touch input (e.g., a tap). Virtual button 306 can be associated with accessibility features.
  • A test script (e.g., test script 206) can be utilized to test the functions of virtual button 306 and accessibility features associated with virtual button 306. To test a response to a touch input on virtual button 306, the test script can use one or more of the following exemplary functions:
      • hitpoint ( ). The hitpoint function can return a screen position to tap for a specified display element.
      • tap ( ) or tap (object) or tap (location). The tap function can generate a signal simulating a single tap. The tap function can be a method of an identified object, or be performed on the object in the parameter, or be performed at any arbitrary coordinate of the touch-sensitive display. In general, the object in the parameter can include any display or non-display element that can accept a touch input.
      • tapAndHold (duration) or tapAndHold (object, duration) or tapAndHold (location, duration). The tapAndHold function can generate a signal simulating a tap-and-hold. The parameter duration can include a length of time to hold the touch on an element being tapped, in seconds. The default duration value for a tap can be 0. The default value for touch-and-hold gestures (such as drag, pinch open, and pinch close) can be 1.
      • doubleTap ( ) or doubleTap (object) or doubleTap (location). The doubleTap function can generate a signal simulating a double tap.
      • twoFingerTap ( ) or twoFingerTap (object) or twoFingerTap (location). The twoFingerTap function can generate one or more signals simulating a two-finger (or two-stylus) tap.
  • The exemplary functions can be associated with various display elements (e.g., virtual button 306). The display elements can be identified by one or more display element identifiers. For example, the test script can include a function that returns a main display window, use an element ( ) function to retrieve one or more display elements of the main display window, and identify virtual button 306 from the retrieved elements by name, identifier, or class membership.
  • The test script can be utilized to test accessibility functions of virtual button 306. For example, virtual button 306 can be configured to invoke dialog balloon 308 upon receiving a single touch. Dialog balloon 308 can include enlarged text of virtual button 306 as an aid to visually impaired users. Additionally or alternatively, virtual button 306 can be configured to invoke a voice over that includes speech (e.g., synthesized or pre-recorded voice) or a Braille output describing the virtual button 306 upon receiving the single touch. The test script can include functions for capturing dialog balloon 308 (e.g., through a screenshot), the voice over (e.g., through recording), or the Braille output for testing the accessibility functions.
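  • A minimal sketch of a test of virtual button 306 follows; it assumes mainWindow ( ) and element ( ) helpers of the kind referred to above, plus hypothetical delay (seconds) and captureScreenshot (name) functions for pacing and for recording dialog balloon 308:

        var window = mainWindow();                               // hypothetical: returns the main display window
        var button = window.element("Configure Network");        // identify virtual button 306 by name
        tap(button);                                             // signal simulating a single tap on the button
        delay(1);                                                // hypothetical pause while dialog balloon 308 appears
        captureScreenshot("configure-network-balloon");          // hypothetical: record the balloon for later analysis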
  • FIG. 3B illustrates exemplary techniques of testing an application program accepting touch screen inputs that includes a drag-and-drop. The application program “My App” can execute on mobile device 300 b. The application program is configured to display a graphical user interface in which a user can drag-and-drop a display element 310 by tapping (and holding) on display element 310 using a touching means (finger or stylus); while maintaining contact with display element 310 on a display screen 302 b, moving the touching means into a display area 314; and releasing display element 310. Upon completion of the drag-and-drop, display element 310 can be displayed in display area 314.
  • A test script (e.g., test script 206) can be utilized to test the drag-and-drop functions. To test a response to a drag-and-drop input on display element 310, the test script can use one or more of the following exemplary functions:
      • dragInsideWithOptions (options) or dragInsideWithOptions (object, options) or dragInsideWithOptions (location, options). The dragInsideWithOptions function can be used to generate one or more signals simulating the drag-and-drop input.
  • The options can include a dictionary that specifies characteristics of a gesture (e.g., a movement of a touching means on a touch-sensitive display screen). The characteristics can include one or more of the following:
      • touchCount—a number of touches to use in the specified gesture. For example, the touchCount can represent a number of touching means (fingers or styluses) a user would use to make the specified gesture. The default touchCount value can be 1.
      • duration—a length of time to hold the touch on an element being tapped, in seconds. The default duration value for a tap can be 0. The default value for touch-and-hold gestures (such as drag, pinch open, and pinch close) can be 1.
      • startOffset—a first offset to use in a multiple-point gesture (e.g., a drag-and-drop). The offset can be measured against a current display element within which the drag-and-drop occurs. A default value can be {x:0.0, y:0.0}.
      • endOffset—a last offset to use in a multiple point gesture (e.g., a drag-and-drop). A default value can be {x:0.0, y:0.0}.
  • The test script can use offsets to achieve finer precision in specifying the hitpoint within a rectangle (rect) for the specified display element. The offset can include a pair of values x and y, each ranging from 0.0 to 1.0. The x and y values can represent, respectively, relative horizontal and vertical positions within the rectangle, with {x:0.0, y:0.0} as the top left and {x:1.0, y:1.0} as the bottom right. For example, {x:0.3, y:0.6} can specify a position just below and to the left of a center; and {x:1.0, y:0.5} can specify a position centered vertically at the far right.
  • Additionally or alternatively, the test script can use one or more of the following exemplary functions:
      • dragFromToForDuration (fromPointObject, toPointObject, duration).
  • The fromPointObject parameter can specify a rectangle or point from which the drag action is to begin. The toPointObject parameter can specify a rectangle or point to which the drag action is to end. The duration parameter can specify a length of time, in seconds, between starting and stopping of the drag-and-drop action.
  • The test script can use the function to simulate the drag-and-drop input that drags display element 310 and drops display element 310 into display area 314. A graphical user interface before the drag-and-drop and a graphical user interface after the drag-and-drop can be recorded.
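  • A minimal sketch of such a drag-and-drop test follows; element310 stands for a hypothetical script reference to display element 310, and the coordinates passed to dragFromToForDuration are illustrative only:

        // Drag display element 310 toward display area 314 using the options dictionary.
        dragInsideWithOptions(element310, {
          touchCount: 1,                        // one touching means
          duration: 1,                          // default for touch-and-hold gestures
          startOffset: {x: 0.5, y: 0.5},        // begin at the center of the element
          endOffset:   {x: 0.5, y: 0.9}         // end near the bottom, toward display area 314
        });
        // Alternatively, drag between two point objects over a two-second interval.
        dragFromToForDuration({x: 40, y: 120}, {x: 40, y: 400}, 2);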
  • FIG. 3C illustrates exemplary techniques of testing an application program accepting touch screen inputs that includes pinching and spreading. The application program “My App” can execute on mobile device 300 c. The application program is configured to display on touch-sensitive screen 302 c a graphical user interface that can accept a pinching input. The pinching input can include a touch screen input where two or more touching means (fingers or styluses) are moving away from each other (pinch-open) or towards each other (pinch-close). The pinch-open and pinch-close inputs can be configured to cause a display element to resize or zoom. For example, display element 326 can be resized to display element 324 upon receiving a pinch-open input. Likewise, display element 324 can be resized to display element 326 upon receiving a pinch-close input. Other functions (e.g., zoom out and zoom in) can be performed in response to the pinch-open and pinch-close inputs.
  • A test script (e.g., test script 206) can be utilized to test the pinching functions. To test a response to a pinch-open input or pinch-close input on display element 324 or 326, the test script can use one or more of the following exemplary functions:
      • pinchOpenFromToForDuration (fromPointObject, toPointObject, duration). This function can generate a signal simulating a pinch-open gesture from a specified starting screen location to a specified ending screen location, for a specified length of time.
      • pinchCloseFromToForDuration (fromPointObject, toPointObject, duration). This function can generate a signal simulating a pinch-close gesture from a specified starting screen location to a specified ending screen location, for a specified length of time.
  • A screenshot of the user interface before and after simulated signals of the pinch-open and pinch-close inputs can be recorded. The screenshot can be analyzed to determine, for example, whether display element 324 or 326 has resized correctly.
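  • A minimal sketch of such a pinch test follows; the point objects are illustrative screen locations, and captureScreenshot is the same hypothetical recording helper assumed above:

        // Pinch-open over one second to resize display element 326 into display element 324.
        pinchOpenFromToForDuration({x: 120, y: 260}, {x: 120, y: 140}, 1);
        captureScreenshot("after-pinch-open");                   // hypothetical: record for analysis
        // Pinch-close over one second to return the element to its smaller size.
        pinchCloseFromToForDuration({x: 120, y: 140}, {x: 120, y: 260}, 1);
        captureScreenshot("after-pinch-close");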
  • FIG. 3D illustrates exemplary techniques of testing an application program accepting a flick input. The application program “My App” can execute on mobile device 300 d. The application program is configured to display on touch-sensitive screen 302 d a graphical user interface that can accept a flick input. The flick input can include a touch screen input that includes a quick movement of a touching means (finger or stylus) to a direction. The flick input can be configured to cause a display element to move to a next element. For example, display element 332 (e.g., a first virtual page) can be configured to curl up to reveal another display element (e.g., a second virtual page) underneath. Alternatively or additionally, the display element can move to the left, right, up, or down to reveal a next display element. In an example accessibility feature, a menu item can be voiced over. A flick input can cause a next menu item to be voiced over.
  • A test script (e.g., test script 206) can be utilized to test the flicking functions. To test a response to a flick input on display element 332, the test script can use one or more of the following exemplary functions:
      • flickInsideWithOptions (options) or flickInsideWithOptions (object, options) or flickInsideWithOptions (location, options).
  • The options can include a touchCount, a startOffset, and an endOffset, as described above, and other parameters. The functions can generate one or more signals simulating the flick input.
  • A screenshot of the user interface before the signals simulating the flick input and a screenshot of the user interface after the signals simulating the flick input can be taken. The screenshots can be analyzed to determine, for example, whether display element 332 has moved in accordance with a specified manner (e.g., according to design). A voice recording can be made if the simulated signals of the flick input cause a new menu item to be voiced over.
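  • A minimal sketch of such a flick test follows; element332 stands for a hypothetical script reference to display element 332:

        // Flick display element 332 from right to left to reveal the next display element.
        flickInsideWithOptions(element332, {
          touchCount: 1,
          startOffset: {x: 0.9, y: 0.5},        // begin near the right edge of the element
          endOffset:   {x: 0.1, y: 0.5}         // end near the left edge, producing a leftward flick
        });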
  • An application program accepting a rotation gesture input can be tested. The application program can execute on a mobile device. The application program is configured to display on a touch-sensitive screen a graphical user interface that can accept a rotation gesture input (a type of touch-and-hold gesture). The rotation gesture input can include a touch screen input that includes a rotating movement of one or more touching means (fingers or styluses) in a clockwise or counterclockwise direction (e.g., rotating a finger while it is pressing on the touch screen). For example, the rotation gesture input can be configured to cause a display element to rotate. As another example, the rotation gesture input can be configured to cause a dial-like display element to turn to a new value.
  • A test script (e.g., test script 206) can be utilized to test the rotation gesture functions. To test a response to a rotation gesture input on a display element, the test script can use one or more of the following exemplary functions:
      • rotateWithOptions (options) or rotateWithOptions (location, options). The rotateWithOptions function can be used to generate one or more signals simulating the rotation gesture input.
  • The options parameter can include a dictionary that specifies characteristics of a rotation gesture (e.g., a movement of one or more touching means on a touch-sensitive display screen). The characteristics can include one or more of the following:
      • touchCount—a number of touches to use in the specified gesture. For example, the touchCount can represent a number of touching means (fingers or styluses) a user would use to make the specified gesture. A valid touchCount value can range from 1 to 5 (inclusive), and the default touchCount value can be 2.
      • duration—a length of time to hold the touch on an element being rotated, in seconds. The default duration value can be 1.
      • rotation—an amount of rotation, in radians. The default value can be pi (π).
      • centerOffset—an offset to use for the center of the rotation gesture. A default value can be {x:0.0, y:0.0}. This characteristic is not used with a rotateWithOptions function that includes a location parameter.
  • The location parameter can include a point object at the center of the rotation gesture, with properties for x and y. The relevant coordinates are screen-relative and are adjusted to account for device orientation.
  • A screenshot of the user interface before the signals simulating the rotate gesture input and a screenshot of the user interface after the signals simulating the rotate gesture input can be taken. The screenshots can be analyzed to determine, for example, whether the display element has moved in accordance with a specified manner (e.g., according to design).
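  • A minimal sketch of such a rotation gesture test follows; the location point object is an illustrative screen-relative center for the gesture:

        // Rotate a dial-like display element by half a turn using two touches held for one second.
        rotateWithOptions({x: 160, y: 240}, {
          touchCount: 2,                        // default touchCount value for the rotation gesture
          duration: 1,                          // default duration, in seconds
          rotation: Math.PI                     // amount of rotation, in radians (default value pi)
        });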
  • FIGS. 4A and 4B illustrate exemplary techniques of testing programs accepting inputs other than touch screen input. FIG. 4A illustrates exemplary techniques of testing an application program accepting a shake input. The application program “My App” can execute on mobile device 400 a. The application program can be configured to provide a response to a motion input of mobile device 400 a. The motion input (e.g., a shake) can be detected using a motion sensor, an angular rate sensor, or an accelerometer. The response can include, for example, a voice over of all input items displayed on a display screen, opening or closing a particular user interface, or turning on or turning off mobile device 400 a.
  • A test script (e.g., test script 206) can be utilized to test the application program in response to a shake input. To test the response to the shake input, the test script can use a shake function, which can provide a signal simulating a shake event of mobile device 400 a. The shake function can have parameters that can be used to specify a direction, magnitude, and duration of the shake. Screenshots of user interfaces before and after the simulated signals of the shake input can be recorded. A voice recording can be made if the simulated signals of the shake input cause a voice over.
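  • A minimal sketch of such a shake test follows; the parameter names passed to the shake function are assumptions, since the specification states only that a direction, magnitude, and duration can be specified:

        // Simulate a shake event of mobile device 400 a.
        shake({direction: "horizontal", magnitude: 1.0, duration: 2});   // parameter names are hypothetical
        captureScreenshot("after-shake");                                // hypothetical: record the response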
  • FIG. 4B illustrates exemplary techniques of testing an application program accepting a change of orientation input. The application program “My App” executes on mobile device 400 b and can be configured to provide a response to a change of orientation of mobile device 400 b. An orientation of mobile device 400 b can include, for example, a portrait mode (e.g., upside up or upside down), a landscape mode (e.g., landscape left or landscape right), and a face mode (e.g., face up or face down). The orientation can additionally or alternatively include an orientation of mobile device 400 b in a global reference frame. For example, the orientation can include a pointing direction or heading of an axis of the mobile device (e.g., “north west” or “bearing 315 degrees”). The change of orientation can be detected using an accelerometer, an angular rate sensor, a motion sensor, a magnetometer, or a combination of the sensors.
  • The application program “My App” executing on mobile device 400 b can alternatively or additionally be configured to provide a response to a change of location of mobile device 400 b. The change of location can be detected using a GPS device, a wireless triangulation device, a baseband processor, a magnetometer, or a combination of the sensors. A test script (e.g., test script 206) can be utilized to test behavior of the application program in response to a change of orientation or change of location. To test a response to a change of orientation or change of location input the test script can use one or more of the following exemplary functions:
      • getDeviceOrientation ( ). This function can retrieve an orientation of mobile device 400 b.
      • setDeviceOrientation (orientation). This function can set an orientation of mobile device 400 b.
      • getDeviceLocation ( ). This function can get a location of mobile device 400 b.
      • setDeviceLocation (location). This function can set a location of mobile device 400 b.
  • The functions can generate one or more signals simulating a change in orientation or location. Screenshots of user interfaces before and after the simulated signals of the change of orientation or change of location can be recorded. A recording can be made if the simulated signals of the change of orientation or change of location input cause a voice over. A change of user interfaces (e.g., a change of a map display or a change of a compass dial display) can be compared against the simulated change of orientation or change of location to determine whether the application program “My App” behaves correctly.
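  • A minimal sketch of such an orientation and location test follows; the orientation constant and the shape of the location object are assumptions rather than values defined by this specification:

        // Simulate a change of orientation and record the relaid-out user interface.
        var previous = getDeviceOrientation();                       // retrieve the current orientation
        setDeviceOrientation("landscape-left");                      // hypothetical orientation value
        captureScreenshot("after-orientation-change");               // hypothetical recording helper
        // Simulate a change of location, for example to exercise a map or compass display.
        setDeviceLocation({latitude: 37.33, longitude: -122.03});    // hypothetical location object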
  • FIG. 5 illustrates an exemplary user interface 500 of testing a touch-input program. Exemplary user interface 500 can be displayed on a display device coupled to a testing host (e.g., testing host 204 of FIG. 2) or a testing device (e.g., testing device 202 of FIG. 2).
  • User interface 500 can include instruments section 502. Instruments section 502 can present for display various testing projects (e.g., groups of scripts, application programs, and associated devices). In the example given, a test of a touch-input program to be executed on example device “Device 1” is being conducted.
  • User interface 500 can include script control section 504. Script control section 504 can include control elements that enable a user to select one or more automated test scripts (e.g., test script 206 of FIG. 2). The test scripts can be written using various programming languages (e.g., JavaScript). Script control section 504 can include controls for opening one or more editing windows of the test scripts.
  • User interface 500 can include screenshot control section 506. Screenshot control section 506 can include controls that can configure a time of taking screenshots. The screenshots can be taken and recorded upon request or at predetermined screenshot time intervals or at screenshot points specified in the test script.
  • User interface 500 can include time control sections 508 a and 508 b. Time control sections 508 a and 508 b can include controls for configuring an inspection time range for the testing. The inspection time range can specify a beginning time and an end time for inspecting behavior of the application program being tested. The beginning time and end time can be offsets from the beginning of the application program execution.
  • User interface 500 can include script log section 510. Script log section 510 can include a set of events triggered by the test script. The set of events can be organized by time. A selection of a particular event can cause data related to the event to be displayed in event display section 512.
  • Event display section 512 can display one or more events selected by the user. The event can be displayed in association with event data. The event data can include stack data, variable data (e.g., values of variables at time of the event), and memory status. The event data can include one or more screenshots (e.g., screenshot 514) of the event. For example, the screenshot can include a display element (e.g., display element 324 of FIG. 3C) before a pinch-close input is applied to the display element. Additionally or alternatively, the event data can include an audio clip (e.g., a voice over) associated with an event.
  • As mentioned above, an application program that is configured to receive a touch input can be tested in an automated manner. A testing mechanism queries the application program's user interface (e.g., by sending messages to accessibility objects using the Accessibility API) to identify a touch input that is to produce a specified result. The testing mechanism then generates one or more signals simulating the touch input. These signals are input to the application program (e.g., by sending messages via the Accessibility API to various accessibility objects). The application program responds accordingly, as if a user had actually performed the touch input. For example, the application program modifies its user interface based on the (simulated) touch input. The testing mechanism then queries the application program's user interface again, to determine whether the user interface conforms to the specified result.
  • FIG. 6 is a flowchart illustrating exemplary process 600 of automatically testing a touch-input application program. For convenience, process 600 will be described in reference to a system implementing process 600. The system can include a device that executes the touch-input program, or a computer connected to the device, or both.
  • The system can interrogate (602) the user interface of the application program to identify a touch input operable for producing a specified result in the application program. The application program can include a user interface element that accepts a touch input. In some implementations, the touch input includes a gesture of one or more touching means (fingers or styluses) on a display screen of the device. The user interface element can include one or more accessibility components (e.g., accessibility objects) that describe one or more accessibility attributes of the user interface element. The values of these attributes provide information about the user interface object and can be queried by the system.
  • The system can generate (604) one or more signals simulating the touch input. The one or more signals simulating the touch input can be provided at least in part based on the accessibility attribute described in the accessibility component of the user interface element. For example, the signals can correspond to the tap or flick that is to elicit the specified result.
  • The system interrogates (606) the user interface again, checking for the expected changes. The system can also receive events logged by the script (e.g., failure to simulate the touch input because the specified user interface element does not exist or an explicit script verification of one or more facets of the current user interface state) (not shown). An event can include an accessibility feature that corresponds to the accessibility attribute described in the accessibility component of the user interface element.
  • The system can determine (608) that the actual result conforms to the specified result. Determining that the actual result conforms to the specified result can include comparing screenshots of the actual result with design specifications of the touch-input program.
  • In some implementations, the system can provide a library of signals simulating various touch inputs. A test script for testing the touch-input application program can include serial or parallel calls to the library to generate one or more signals simulating sequential or concurrent touch inputs.
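  • A minimal end-to-end sketch of process 600 as a test script follows, combining the querying and simulation calls sketched above; the mainWindow, element, delay, logPass, and logFail helpers are hypothetical:

        // (602) Interrogate the user interface to identify the touch input.
        var button = mainWindow().element("Configure Network");
        // (604) Generate one or more signals simulating the touch input.
        tap(button);
        delay(1);
        // (606) Interrogate the user interface again for the expected change.
        var balloon = mainWindow().element("Configure Network Balloon");  // hypothetical element name
        // (608) Determine whether the actual result conforms to the specified result.
        if (balloon !== null) { logPass("balloon displayed"); } else { logFail("balloon missing"); }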
  • FIG. 7 is a flowchart illustrating an exemplary process of testing a touch-input program in an exemplary testing environment.
  • A testing host can receive (702) a first event. The first event can include the test script logging a simulated touch input to an application program executing on a device. The device can be a device connected to the testing host through a communications network (e.g., a local area network (LAN) or a wide area network (WAN)). The first event can be generated from a test script. The simulated touch input for the application program can be produced by the testing script, or by a testing tool kit in response to an execution of the testing script. The simulated touch input can include a single touch input, a multi touch input, a gesture, or a physical movement of the device. The simulated user input is associated with a timestamp.
  • The testing host can receive (704) a second event from the test script, verifying that the application program user interface is in an expected state following the touch input. The event can include a screenshot of the application program.
  • The testing host can acquire (706) a status of the device. The status can correspond to the response to the simulated touch input. The status can include a call stack, a memory status, including memory failure status, or a memory leak status.
  • The testing host can provide (708) for display a monitoring interface. Providing for display the monitoring interface includes providing for display, based on a timestamp, the visual representation of the first event, the visual representation of the second event, and visual representations of other events in a timeline.
  • Exemplary Mobile Device Architecture
  • FIG. 8 is a block diagram illustrating exemplary device architecture 800 of a mobile device implementing the features and operations described in reference to FIGS. 1-7. The mobile device can be, for example, a handheld computer, a personal digital assistant, a cellular telephone, an electronic tablet, a network appliance, a camera, a smart phone, an enhanced general packet radio service (EGPRS) mobile phone, a network base station, a media player, a navigation device, an email device, a game console, or a combination of any two or more of these data processing devices or other data processing devices.
  • The mobile device can include memory interface 802, one or more data processors, image processors and/or processors 804, and peripherals interface 806. Memory interface 802, one or more processors 804 and/or peripherals interface 806 can be separate components or can be integrated in one or more integrated circuits. Processors 804 can include one or more application processors (APs) and one or more baseband processors (BPs). The application processors and baseband processors can be integrated in a single processor chip. The various components in the mobile device, for example, can be coupled by one or more communication buses or signal lines.
  • Sensors, devices, and subsystems can be coupled to peripherals interface 806 to facilitate multiple functionalities. For example, motion sensor 810, light sensor 812, and proximity sensor 814 can be coupled to peripherals interface 806 to facilitate orientation, lighting, and proximity functions of the mobile device. Location processor 815 (e.g., GPS receiver) can be connected to peripherals interface 806 to provide geopositioning. Electronic magnetometer 816 (e.g., an integrated circuit chip) can also be connected to peripherals interface 806 to provide data that can be used to determine the direction of magnetic North. Thus, electronic magnetometer 816 can be used as an electronic compass. Accelerometer 817 can also be connected to peripherals interface 806 to provide data that can be used to determine change of speed and direction of movement of the mobile device. An angular rate sensor (e.g., a Micro-Electro-Mechanical System (MEMS) gyro) can be connected to peripherals interface 806 to provide data that can be used to determine a rotational velocity of the mobile device.
  • Camera subsystem 820 and an optical sensor 822, e.g., a charged coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical sensor, can be utilized to facilitate camera functions, such as recording photographs and video clips.
  • Communication functions can be facilitated through one or more wireless communication subsystems 824, which can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. The specific design and implementation of the communication subsystem 824 can depend on the communication network(s) over which a mobile device is intended to operate. For example, a mobile device can include communication subsystems 824 designed to operate over a GSM network, a GPRS network, an EDGE network, a Wi-Fi or WiMax network, and a Bluetooth network. In particular, the wireless communication subsystems 824 can include hosting protocols such that the mobile device can be configured as a base station for other wireless devices.
  • Audio subsystem 826 can be coupled to a speaker 828 and a microphone 830 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions.
  • I/O subsystem 840 can include touch screen controller 842 and/or other input controller(s) 844. Touch-screen controller 842 can be coupled to a touch screen 846 or pad. Touch screen 846 and touch screen controller 842 can, for example, detect contact and movement or break thereof using any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 846.
  • Other input controller(s) 844 can be coupled to other input/control devices 848, such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus. The one or more buttons (not shown) can include an up/down button for volume control of speaker 828 and/or microphone 830.
  • In one implementation, a pressing of the button for a first duration may disengage a lock of the touch screen 846; and a pressing of the button for a second duration that is longer than the first duration may turn power to the mobile device on or off. The user may be able to customize a functionality of one or more of the buttons. The touch screen 846 can, for example, also be used to implement virtual or soft buttons and/or a keyboard.
  • In some implementations, the mobile device can present recorded audio and/or video files, such as MP3, AAC, and MPEG files. In some implementations, the mobile device can include the functionality of an MP3 player. The mobile device may, therefore, include a pin connector that is compatible with the MP3 player. Other input/output and control devices can also be used.
  • Memory interface 802 can be coupled to memory 850. Memory 850 can include high-speed random access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory (e.g., NAND, NOR). Memory 850 can store operating system 852, such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks. Operating system 852 may include instructions for handling basic system services and for performing hardware dependent tasks. In some implementations, operating system 852 can include a kernel (e.g., UNIX kernel).
  • Memory 850 may also store communication instructions 854 to facilitate communicating with one or more additional devices, one or more computers and/or one or more servers. Memory 850 may include graphical user interface instructions 856 to facilitate graphic user interface processing; sensor processing instructions 858 to facilitate sensor-related processing and functions; phone instructions 860 to facilitate phone-related processes and functions; electronic messaging instructions 862 to facilitate electronic-messaging related processes and functions; web browsing instructions 864 to facilitate web browsing-related processes and functions; media processing instructions 866 to facilitate media processing-related processes and functions; GPS/Navigation instructions 868 to facilitate GPS and navigation-related processes and instructions; camera instructions 870 to facilitate camera-related processes and functions; magnetometer data 872 and calibration instructions 874 to facilitate magnetometer calibration. The memory 850 may also store other software instructions (not shown), such as security instructions, web video instructions to facilitate web video-related processes and functions, and/or web shopping instructions to facilitate web shopping-related processes and functions. In some implementations, the media processing instructions 866 are divided into audio processing instructions and video processing instructions to facilitate audio processing-related processes and functions and video processing-related processes and functions, respectively. An activation record and International Mobile Equipment Identity (IMEI) or similar hardware identifier can also be stored in memory 850. Memory 850 can include testing instructions 876 that can perform one or more functions as described above.
  • Each of the above identified instructions and applications can correspond to a set of instructions for performing one or more functions described above. These instructions need not be implemented as separate software programs, procedures, or modules. Memory 850 can include additional instructions or fewer instructions. Furthermore, various functions of the mobile device may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.
  • Exemplary System Architecture
  • FIG. 9 is a block diagram of an exemplary system architecture 900 for implementing the features and operations described in reference to FIGS. 1-8. Other architectures are possible, including architectures with more or fewer components. In some implementations, architecture 900 includes one or more processors 902 (e.g., dual-core Intel® Xeon® Processors), one or more output devices 904 (e.g., LCD), one or more network interfaces 906, one or more input devices 908 (e.g., mouse, keyboard, touch-sensitive display, touch pad, touch switch) and one or more computer-readable mediums 912 (e.g., RAM, ROM, SDRAM, hard disk, optical disk, flash memory, etc.). These components can exchange communications and data over one or more communication channels 910 (e.g., buses), which can utilize various hardware and software for facilitating the transfer of data and control signals between components.
  • The term “computer-readable medium” refers to any medium that participates in providing instructions to processor 902 for execution, including without limitation, non-volatile media (e.g., optical or magnetic disks), volatile media (e.g., memory) and transmission media. Transmission media includes, without limitation, coaxial cables, copper wire and fiber optics.
  • Computer-readable medium 912 can further include operating system 914 (e.g., Mac OS® server, Windows® NT server), network communication module 916, database interface 920, test script 930, event recorder 940, event data store 950, and event analyzer 960, as described in reference to FIGS. 1-8. Operating system 914 can be multi-user, multiprocessing, multitasking, multithreading, real time, etc. Operating system 914 performs basic tasks, including but not limited to: recognizing input from and providing output to devices 908, 904; keeping track of and managing files and directories on computer-readable mediums 912 (e.g., memory or a storage device); controlling peripheral devices; and managing traffic on the one or more communication channels 910. Network communications module 916 includes various components for establishing and maintaining network connections (e.g., software for implementing communication protocols, such as TCP/IP, HTTP, etc.). Database interface 920 can include an interface to various data stores such as event data store 950.
  • Architecture 900 can be included in any device capable of hosting a database application program. Architecture 900 can be implemented in a parallel processing or peer-to-peer infrastructure or on a single device with one or more processors. Software can include multiple software components or can be a single body of code.
  • The described features can be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device. A computer program is a set of instructions that can be used, directly or indirectly, in a computer to perform a certain activity or bring about a certain result. A computer program can be written in any form of programming language (e.g., Objective-C, Java), including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
  • Suitable processors for the execution of a program of instructions include, by way of example, both general and special purpose microprocessors, and the sole processor or one of multiple processors or cores, of any kind of computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for executing instructions and one or more memories for storing instructions and data. Generally, a computer will also include, or be operatively coupled to communicate with, one or more mass storage devices for storing data files; such devices include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks. Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).
  • To provide for interaction with a user, the features can be implemented on a computer having a display device such as a CRT (cathode ray tube) or LCD (liquid crystal display) monitor for displaying information to the user and a keyboard and a pointing device such as a mouse or a trackball by which the user can provide input to the computer.
  • The features can be implemented in a computer system that includes a back-end component, such as a data server, or that includes a middleware component, such as an application server or an Internet server, or that includes a front-end component, such as a client computer having a graphical user interface or an Internet browser, or any combination of them. The components of the system can be connected by any form or medium of digital data communication such as a communication network. Examples of communication networks include, e.g., a LAN, a WAN, and the computers and networks forming the Internet.
  • The computer system can include clients and servers. A client and server are generally remote from each other and typically interact through a network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • A number of implementations have been described. Nevertheless, it will be understood that various modifications can be made without departing from the spirit and scope of the disclosure. For example, pseudo code of exemplary functions is provided. Functions that implement the features described in this specification can have any names or parameters. Accordingly, other implementations are within the scope of the following claims.
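  • By way of a further non-limiting illustration (the test-harness objects "touch_lib" and "app" and their methods below are hypothetical, not interfaces disclosed above), a test script practicing the described method might query the user interface of the application program for an element by an accessibility attribute, invoke a library routine that generates one or more signals simulating a tap on that element, and then determine whether the resulting user interface state conforms with the specified result:

      # Hypothetical sketch of a test script; "touch_lib" and "app" are assumed
      # harness objects supplied by a testing framework, not a disclosed API.
      def test_clear_button_empties_list(touch_lib, app):
          # Query the application's user interface via an accessibility attribute.
          button = app.element(accessibility_label="Clear All")

          # Generate one or more signals simulating a single-finger tap on the
          # element and deliver them to the application program under test.
          touch_lib.tap(x=button.center_x, y=button.center_y)

          # Receive the result from the application program and determine
          # whether it conforms with the specified result.
          assert app.element(accessibility_label="Item List").child_count == 0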

Claims (20)

1. A method, comprising:
identifying a touch input operable for producing a specified result in an application program;
generating one or more signals simulating the touch input;
receiving, in response to the one or more signals, a result from the application program; and
determining that the received result conforms with the specified result.
2. The method of claim 1, where the application program includes a user interface element that accepts the touch input.
3. The method of claim 2, where the user interface element includes an accessibility component that describes an accessibility attribute of the user interface element.
4. The method of claim 3, where the one or more signals simulating the touch input are provided at least in part based on the accessibility attribute described in the accessibility component of the user interface element.
5. The method of claim 3, where the received result includes an accessibility feature that corresponds to the accessibility attribute described in the accessibility component of the user interface element.
6. The method of claim 1, where the touch input includes a gesture of one or more touching means on a touch input component.
7. The method of claim 6, where the touch input component comprises a touch-sensitive display or a touch pad.
8. The method of claim 1, further comprising:
providing a library of signals simulating touch inputs; and
providing a script that includes calls to the library.
9. A method executed on a computer, comprising:
receiving a first event, the first event including a simulated touch input to an application program executing on a device;
receiving a second event from the device, the second event including a response to the simulated touch input, the response produced by a verification step performed by a test script;
acquiring a status of the device, the status corresponding to the response to the simulated touch input produced; and
providing for display a monitoring interface, the monitoring interface including a visual representation of the first event, a visual representation of the second event, and one or more visual representations of the status associated with the second event.
10. The method of claim 9, where the device is a mobile device connected to the computer through a wide area network (WAN).
11. The method of claim 9, further comprising:
executing the test script, the test script producing the simulated touch input to the application program.
12. The method of claim 9, where acquiring the status of the device comprises:
acquiring a memory status, the memory status including at least one element of a group containing a call stack, a memory failure status, and a memory leak status.
13. The method of claim 9, where receiving the second event from the device includes receiving a screenshot from the device.
14. The method of claim 9, where the simulated touch input includes a simulated multi-touch input.
15. The method of claim 9, where:
the simulated touch input is associated with a timestamp; and
providing for display the monitoring interface includes providing for display, based on the timestamp, the visual representation of the first event, the visual representation of the second event, and visual representations of other events in a timeline.
16. A computer program product tangibly stored on a storage device, operable to cause data processing apparatus to perform operations comprising:
identifying a touch input operable for producing a specified result in an application program;
generating one or more signals simulating the touch input; and
determining whether a user interface of the application program conforms to the specified result.
17. The product of claim 16, where the application program includes a user interface element that accepts the touch input.
18. The product of claim 17, where the user interface element includes an accessibility component that describes an accessibility attribute of the user interface element.
19. The product of claim 18, where the one or more signals simulating the touch input are provided at least in part based on the accessibility attribute described in the accessibility component of the user interface element.
20. A system, comprising:
one or more computers configured to perform operations comprising:
identifying a touch input operable for producing a specified result in an application program;
generating one or more signals simulating the touch input; and
determining whether a user interface of the application program conforms with the specified result.
US13/165,672 2010-06-21 2011-06-21 Testing a Touch-Input Program Abandoned US20110310041A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/165,672 US20110310041A1 (en) 2010-06-21 2011-06-21 Testing a Touch-Input Program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US35709010P 2010-06-21 2010-06-21
US13/165,672 US20110310041A1 (en) 2010-06-21 2011-06-21 Testing a Touch-Input Program

Publications (1)

Publication Number Publication Date
US20110310041A1 true US20110310041A1 (en) 2011-12-22

Family

ID=45328187

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/164,694 Active 2032-08-11 US8966447B2 (en) 2010-06-21 2011-06-20 Capturing and displaying state of automated user-level testing of a graphical user interface application
US13/165,672 Abandoned US20110310041A1 (en) 2010-06-21 2011-06-21 Testing a Touch-Input Program

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/164,694 Active 2032-08-11 US8966447B2 (en) 2010-06-21 2011-06-20 Capturing and displaying state of automated user-level testing of a graphical user interface application

Country Status (1)

Country Link
US (2) US8966447B2 (en)

Families Citing this family (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8543980B2 (en) * 2010-08-23 2013-09-24 Micro Focus (Us), Inc. State driven testing
DE102011008277B4 (en) * 2011-01-11 2017-01-12 Brose Fahrzeugteile Gmbh & Co. Kommanditgesellschaft, Bamberg Sensor unit for contactless actuation of a vehicle door
US8799769B2 (en) 2011-02-08 2014-08-05 Ebay Inc. Application above-the-fold rendering measurements
US8719795B2 (en) * 2011-04-12 2014-05-06 Miami International Security Exchange, Llc System and method for automating testing of computers
US9256510B2 (en) * 2011-06-09 2016-02-09 International Business Machines Corporation Automatic rules based capturing of graphical objects for specified applications
US8793578B2 (en) * 2011-07-11 2014-07-29 International Business Machines Corporation Automating execution of arbitrary graphical interface applications
US10983947B2 (en) * 2011-11-21 2021-04-20 Robert Keith Mykland Method and dynamically reconfigurable processor adapted for management of persistence of information across multiple instruction cycles
US9135714B1 (en) * 2011-11-28 2015-09-15 Innovative Defense Technologies, LLC Method and system for integrating a graphical user interface capture for automated test and retest procedures
CN103365641B (en) * 2012-03-31 2016-05-11 国际商业机器公司 A kind of method for designing of GUI developing instrument and system
US20130275946A1 (en) * 2012-04-16 2013-10-17 Oracle International Corporation Systems and methods for test development process automation for a test harness
US20140033179A1 (en) * 2012-07-30 2014-01-30 Hewlett-Packard Development Company Lp Application testing
US8925076B2 (en) 2012-12-11 2014-12-30 Kaspersky Lab Zao Application-specific re-adjustment of computer security settings
US8954546B2 (en) 2013-01-25 2015-02-10 Concurix Corporation Tracing with a workload distributor
US9256969B2 (en) 2013-02-01 2016-02-09 Microsoft Technology Licensing, Llc Transformation function insertion for dynamically displayed tracer data
US9323863B2 (en) 2013-02-01 2016-04-26 Microsoft Technology Licensing, Llc Highlighting of time series data on force directed graph
US20130232452A1 (en) * 2013-02-01 2013-09-05 Concurix Corporation Force Directed Graph with Time Series Data
US9021447B2 (en) 2013-02-12 2015-04-28 Concurix Corporation Application tracing by distributed objectives
US20130283281A1 (en) 2013-02-12 2013-10-24 Concurix Corporation Deploying Trace Objectives using Cost Analyses
US8843901B2 (en) 2013-02-12 2014-09-23 Concurix Corporation Cost analysis for selecting trace objectives
US8924941B2 (en) 2013-02-12 2014-12-30 Concurix Corporation Optimization analysis using similar frequencies
US8997063B2 (en) 2013-02-12 2015-03-31 Concurix Corporation Periodicity optimization in an automated tracing system
US9372777B2 (en) * 2013-02-28 2016-06-21 International Business Machines Corporation Collecting and attaching a bug trace to a problem information technology ticket
US20140258894A1 (en) * 2013-03-05 2014-09-11 Research In Motion Limited Visual Timeline Of An Application History
US9158518B2 (en) 2013-03-11 2015-10-13 Blackberry Limited Collaborative application development environment using a connected device
US9773264B2 (en) 2013-03-26 2017-09-26 Blackberry Limited Method for providing composite user interface controls and an online storefront for same
US10735496B2 (en) 2013-04-01 2020-08-04 Autodesk, Inc. Server side video screen capture
US9575874B2 (en) 2013-04-20 2017-02-21 Microsoft Technology Licensing, Llc Error list and bug report analysis for configuring an application tracer
CN104123219B (en) * 2013-04-28 2017-05-24 国际商业机器公司 Method and device for testing software
US9734040B2 (en) 2013-05-21 2017-08-15 Microsoft Technology Licensing, Llc Animated highlights in a graph representing an application
US8990777B2 (en) 2013-05-21 2015-03-24 Concurix Corporation Interactive graph for navigating and monitoring execution of application code
CN104216691B (en) * 2013-05-31 2017-11-17 华为技术有限公司 A kind of method and device for creating application
US9280841B2 (en) 2013-07-24 2016-03-08 Microsoft Technology Licensing, Llc Event chain visualization of performance data
US9292415B2 (en) 2013-09-04 2016-03-22 Microsoft Technology Licensing, Llc Module specific tracing in a shared module environment
US8881111B1 (en) 2013-09-17 2014-11-04 Xamarin Inc. Testing user interface responsiveness for mobile applications
US8856748B1 (en) * 2013-09-17 2014-10-07 Xamarin Inc. Mobile application testing platform
US9772927B2 (en) 2013-11-13 2017-09-26 Microsoft Technology Licensing, Llc User interface for selecting tracing origins for aggregating classes of trace data
WO2015071777A1 (en) 2013-11-13 2015-05-21 Concurix Corporation Software component recommendation based on multiple trace runs
US20150149024A1 (en) * 2013-11-22 2015-05-28 Sikorsky Aircraft Corporation Latency tolerant fault isolation
US11521229B2 (en) * 2014-01-09 2022-12-06 Xandr Inc. Systems and methods for mobile advertisement review
US9519570B2 (en) 2014-03-19 2016-12-13 International Business Machines Corporation Progressive snapshots in automated software testing
US9703770B2 (en) 2014-03-19 2017-07-11 International Business Machines Corporation Automated validation of the appearance of graphical user interfaces
CN104077210B (en) * 2014-06-06 2017-06-06 百度在线网络技术(北京)有限公司 The localization method and system of a kind of client collapse
US10445166B2 (en) * 2014-06-24 2019-10-15 International Business Machines Corporation System verification of interactive screenshots and log files between client systems and server systems within a network computing environment
US9400737B2 (en) * 2014-08-07 2016-07-26 International Business Machines Corporation Generation of automated unit tests for a controller layer system and method
US10430309B2 (en) * 2015-02-23 2019-10-01 Red Hat, Inc. Duplicating a task sequence from a graphical user interface interaction for a development application in view of trace data
US9459780B1 (en) * 2015-04-29 2016-10-04 Axure Software Solutions, Inc. Documenting interactive graphical designs
US10055340B2 (en) 2015-06-10 2018-08-21 International Business Machines Corporation Dynamic test topology visualization
CN104965791B (en) * 2015-07-20 2017-09-29 上海斐讯数据通信技术有限公司 A kind of method tested Android platform application program
US10175960B2 (en) 2015-11-13 2019-01-08 International Business Machines Corporation User interface area coverage
US10268561B2 (en) * 2016-02-22 2019-04-23 International Business Machines Corporation User interface error prediction
US10417113B1 (en) * 2016-03-10 2019-09-17 Amdocs Development Limited System, method, and computer program for web testing and automation offline storage and analysis
US10324828B2 (en) * 2016-03-28 2019-06-18 Dropbox, Inc. Generating annotated screenshots based on automated tests
US10331540B2 (en) * 2016-03-30 2019-06-25 International Business Machines Corporation Identifying false positive automated tests
US9823953B2 (en) * 2016-04-04 2017-11-21 Bank Of America Corporation Interprogram communication messaging for program synchronization
US10073766B2 (en) * 2016-08-25 2018-09-11 Entit Software Llc Building signatures of application flows
US9870314B1 (en) * 2016-12-12 2018-01-16 Red Hat, Inc. Update testing by build introspection
US9934129B1 (en) * 2017-03-17 2018-04-03 Google Llc Determining application test results using screenshot metadata
KR101956719B1 (en) * 2017-09-29 2019-06-19 (주) 피플아이 Method for producing package software
CN108492342B (en) * 2018-03-22 2022-05-03 网易(杭州)网络有限公司 Method, device, processor, storage medium and terminal for merging broken graphs
US10922162B2 (en) * 2018-06-13 2021-02-16 Dell Products, L.P. Capturing video data and serial data during an information handling system failure
CN109828906B (en) * 2018-12-15 2023-07-04 中国平安人寿保险股份有限公司 UI (user interface) automatic testing method and device, electronic equipment and storage medium
WO2023111685A1 (en) * 2021-12-13 2023-06-22 RealityMine Limited A system for simultaneous recording of the pixels of a screen and of accessibility data

Family Cites Families (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4914568A (en) * 1986-10-24 1990-04-03 National Instruments, Inc. Graphical system for modelling a process and associated method
US5657438A (en) * 1990-11-27 1997-08-12 Mercury Interactive (Israel) Ltd. Interactive system for developing tests of system under test allowing independent positioning of execution start and stop markers to execute subportion of test script
US5432940A (en) * 1992-11-02 1995-07-11 Borland International, Inc. System and methods for improved computer-based training
US5600789A (en) * 1992-11-19 1997-02-04 Segue Software, Inc. Automated GUI interface testing
CA2190043C (en) * 1996-11-12 2001-10-16 Don E. Hameluck Buffered screen capturing software tool usability testing of computer applications
US6332212B1 (en) * 1997-10-02 2001-12-18 Ltx Corporation Capturing and displaying computer program execution timing
US6493868B1 (en) * 1998-11-02 2002-12-10 Texas Instruments Incorporated Integrated development tool
US6611276B1 (en) 1999-08-31 2003-08-26 Intel Corporation Graphical user interface that displays operation of processor threads over time
US20020087949A1 (en) * 2000-03-03 2002-07-04 Valery Golender System and method for software diagnostics using a combination of visual and dynamic tracing
US20030121027A1 (en) * 2000-06-23 2003-06-26 Hines Kenneth J. Behavioral abstractions for debugging coordination-centric software designs
US6785893B2 (en) * 2000-11-30 2004-08-31 Microsoft Corporation Operating system event tracker having separate storage for interrupt and non-interrupt events and flushing the third memory when timeout and memory full occur
US7062749B2 (en) * 2000-12-15 2006-06-13 Promenix, Inc. Measuring, monitoring and tracking enterprise communications and processes
US6769054B1 (en) * 2001-02-26 2004-07-27 Emc Corporation System and method for preparation of workload data for replaying in a data storage environment
CA2347647A1 (en) * 2001-05-15 2002-11-15 Ibm Canada Limited-Ibm Canada Limitee Storing and restoring snapshots of a computer process
US7802236B2 (en) * 2002-09-09 2010-09-21 The Regents Of The University Of California Method and apparatus for identifying similar regions of a program's execution
US7310777B2 (en) * 2002-10-18 2007-12-18 Computer Associates Think, Inc. User interface for viewing performance information about transactions
US7870431B2 (en) * 2002-10-18 2011-01-11 Computer Associates Think, Inc. Transaction tracer
US7020573B2 (en) * 2004-01-12 2006-03-28 Microsoft Corporation Enhanced testing for compliance with universal plug and play protocols
US7478365B2 (en) * 2004-01-13 2009-01-13 Symphony Services Corp. Method and system for rule-based generation of automation test scripts from abstract test case representation
US7379600B2 (en) * 2004-01-28 2008-05-27 Microsoft Corporation Method and system for automatically determining differences in a user interface throughout a development cycle
US7184918B2 (en) * 2004-04-01 2007-02-27 Techsmith Corporation Automated system and method for conducting usability testing
US7627821B2 (en) * 2004-06-15 2009-12-01 Microsoft Corporation Recording/playback tools for UI-based applications
US20060026467A1 (en) * 2004-07-30 2006-02-02 Smadar Nehab Method and apparatus for automatically discovering of application errors as a predictive metric for the functional health of enterprise applications
US7743361B2 (en) * 2004-09-20 2010-06-22 The Mathworks, Inc. Providing block state information for a model based development process
US20060212324A1 (en) * 2005-02-22 2006-09-21 Transparency Software, Inc. Graphical representation of organization actions
US7472378B2 (en) * 2005-02-23 2008-12-30 International Business Machines Corporation Breakpoint management and reconciliation for embedded scripts in a business integration language specified program process
US7444574B2 (en) * 2005-02-24 2008-10-28 International Business Machines Corporation Stimulus extraction and sequence generation for an electric device under test
US7698686B2 (en) 2005-04-15 2010-04-13 Microsoft Corporation Method and apparatus for performance analysis on a software program
US7702958B2 (en) * 2005-05-24 2010-04-20 Alcatel-Lucent Usa Inc. Auto-recording tool for developing test harness files
US7958488B2 (en) * 2005-08-16 2011-06-07 National Instruments Corporation Virtual testing in a development environment
US7694181B2 (en) * 2005-12-12 2010-04-06 Archivas, Inc. Automated software testing framework
US7496627B2 (en) * 2006-03-16 2009-02-24 Exceptional Innovation, Llc Automation control system having digital logging
US8271962B2 (en) * 2006-09-12 2012-09-18 Brian Muller Scripted interactive screen media
US8239831B2 (en) * 2006-10-11 2012-08-07 Micro Focus (Ip) Limited Visual interface for automated software testing
US8429613B2 (en) * 2006-10-31 2013-04-23 Microsoft Corporation Stepping and application state viewing between points
US8079022B2 (en) * 2007-06-04 2011-12-13 Carbon Design Systems, Inc. Simulation of software
US20090150868A1 (en) 2007-12-10 2009-06-11 Al Chakra Method and System for Capturing Movie Shots at the Time of an Automated Graphical User Interface Test Failure
US7840851B2 (en) 2008-02-15 2010-11-23 Red Hat, Inc. Annotating GUI test automation playback and debugging
US8019588B1 (en) * 2008-05-27 2011-09-13 Adobe Systems Incorporated Methods and systems to compare screen captures from emulated devices under test
US7877642B2 (en) * 2008-10-22 2011-01-25 International Business Machines Corporation Automatic software fault diagnosis by exploiting application signatures
US8402318B2 (en) * 2009-03-24 2013-03-19 The Trustees Of Columbia University In The City Of New York Systems and methods for recording and replaying application execution
US20100318312A1 (en) * 2009-06-12 2010-12-16 Nvidia Corporation Simplifying determination of whether a display controller provides video output with desired quality
US8392887B2 (en) * 2009-06-15 2013-03-05 Sas Institute Inc. Systems and methods for identifying graphic user-interface components
US8271950B2 (en) * 2009-07-06 2012-09-18 Microsoft Corporation Test generation from captured user interface status

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040064593A1 (en) * 2002-09-30 2004-04-01 Microsoft Corporation Accessibility system and method
US20050166094A1 (en) * 2003-11-04 2005-07-28 Blackwell Barry M. Testing tool comprising an automated multidimensional traceability matrix for implementing and validating complex software systems
US20050223360A1 (en) * 2004-03-31 2005-10-06 Bea Systems, Inc. System and method for providing a generic user interface testing framework
US20060005132A1 (en) * 2004-06-30 2006-01-05 Microsoft Corporation Smart UI recording and playback framework
US20070037521A1 (en) * 2005-04-18 2007-02-15 Alex Babut System and method of testing wireless component applications
US20080001923A1 (en) * 2006-06-28 2008-01-03 Microsoft Corporation Input Simulation System For Touch Based Devices
US20090077422A1 (en) * 2007-09-19 2009-03-19 Sunil Khaladkar Method and system for accelerating test automation of software applications
US20090265689A1 (en) * 2008-04-16 2009-10-22 Microsoft Corporation Generic validation test famework for graphical user interfaces
US20100095234A1 (en) * 2008-10-07 2010-04-15 Research In Motion Limited Multi-touch motion simulation using a non-touch screen computer input device
US8171406B1 (en) * 2009-08-19 2012-05-01 Symantec Corporation Automating user interface navigation
US20130120280A1 (en) * 2010-05-28 2013-05-16 Tim Kukulski System and Method for Evaluating Interoperability of Gesture Recognizers

Cited By (68)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130086999A1 (en) * 2010-06-22 2013-04-11 Janne Pitkanen Apparatus and method for testing usability
US9201503B2 (en) * 2010-12-29 2015-12-01 Ricoh Company, Limited User interface device, image forming apparatus, user interface control method, and computer program product
US20120173973A1 (en) * 2010-12-29 2012-07-05 Kunihiro Miyauchi User interface device, image forming apparatus, user interface control method, and computer program product
US8650544B2 (en) * 2011-02-08 2014-02-11 Beek Fund B.V. L.L.C. Systems and methods for interactive testing of a computer application
US20120204155A1 (en) * 2011-02-08 2012-08-09 Particle Code, Inc. Systems and Methods for Interactive Testing of a Computer Application
US20130113761A1 (en) * 2011-06-17 2013-05-09 Polymer Vision B.V. Electronic device with a touch sensitive panel, method for operating the electronic device, and display system
US9013453B2 (en) * 2011-06-17 2015-04-21 Creator Technology B.V. Electronic device with a touch sensitive panel, method for operating the electronic device, and display system
US10798438B2 (en) 2011-12-09 2020-10-06 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
US9628844B2 (en) 2011-12-09 2017-04-18 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
US20130159555A1 (en) * 2011-12-20 2013-06-20 Microsoft Corporation Input commands
US8436829B1 (en) * 2012-01-31 2013-05-07 Google Inc. Touchscreen keyboard simulation for performance evaluation
US20130232402A1 (en) * 2012-03-01 2013-09-05 Huawei Technologies Co., Ltd. Method for Processing Sensor Data and Computing Node
US9454234B2 (en) 2012-04-16 2016-09-27 Tencent Technology (Shenzhen) Company Limited Instruction triggering method and device, user information acquisition method and system, terminal, and server
AU2013248815B2 (en) * 2012-04-16 2016-08-25 Tencent Technology (Shenzhen) Company Limited Instruction triggering method and device, user information acquisition method and system, terminal, and server
US9788032B2 (en) 2012-05-04 2017-10-10 Microsoft Technology Licensing, Llc Determining a future portion of a currently presented media program
US20130342469A1 (en) * 2012-06-21 2013-12-26 Microsoft Corporation Touch intensity based on accelerometer readings
US20140035853A1 (en) * 2012-08-06 2014-02-06 Samsung Electronics Co., Ltd. Method and apparatus for providing user interaction based on multi touch finger gesture
US20140045483A1 (en) * 2012-08-08 2014-02-13 Nokia Corporation Methods, apparatuses and computer program products for automating testing of devices
US8862118B2 (en) * 2012-08-08 2014-10-14 Nokia Corporation Methods, apparatuses and computer program products for automating testing of devices
US20140089904A1 (en) * 2012-09-26 2014-03-27 Compuware Corporation Technique for simulating an orientation change event in a test environment
US9355015B2 (en) * 2012-09-26 2016-05-31 Dynatrace Llc Technique for simulating an orientation change event in a test environment
CN103810089A (en) * 2012-11-12 2014-05-21 Sap股份公司 Application of automatic test based on posture
US9342237B2 (en) * 2012-11-12 2016-05-17 Sap Se Automated testing of gesture-based applications
US20140132571A1 (en) * 2012-11-12 2014-05-15 Sap Ag Automated testing of gesture-based applications
US10671367B2 (en) 2012-12-03 2020-06-02 Apkudo, Llc System and method for analyzing user experience of a software application across disparate devices
US10261611B2 (en) * 2012-12-03 2019-04-16 Apkudo, Llc System and method for objectively measuring user experience of touch screen based devices
US10860122B2 (en) 2012-12-03 2020-12-08 Apkudo, Inc. System and method for objectively measuring user experience of touch screen based devices
US9578133B2 (en) 2012-12-03 2017-02-21 Apkudo, Llc System and method for analyzing user experience of a software application across disparate devices
US10409455B2 (en) * 2012-12-17 2019-09-10 Telecom Italia S.P.A. Selection system for an interactive display
US10452527B2 (en) 2013-03-15 2019-10-22 Apkudo, Llc System and method for facilitating field testing of a test application
US9858178B2 (en) 2013-03-15 2018-01-02 Apkudo, Llc System and method for facilitating field testing of a test application
US9367436B2 (en) 2013-03-15 2016-06-14 Apkudo, Llc System and method for coordinating field user testing results for a mobile application across various mobile devices
US9075781B2 (en) 2013-03-15 2015-07-07 Apkudo, Llc System and method for coordinating field user testing results for a mobile application across various mobile devices
US9395890B2 (en) 2013-05-15 2016-07-19 Microsoft Technology Licensing, Llc Automatic discovery of system behavior
US20160239943A1 (en) * 2013-09-13 2016-08-18 Hewlett-Packard Development Company, L.P. Screen orientation
US9710889B2 (en) * 2013-09-13 2017-07-18 Hewlett-Packard Development Company, L.P. Screen orientation
US20150088722A1 (en) * 2013-09-26 2015-03-26 Trading Technologies International, Inc. Methods and Apparatus to Implement Spin-Gesture Based Trade Action Parameter Selection
US9727915B2 (en) * 2013-09-26 2017-08-08 Trading Technologies International, Inc. Methods and apparatus to implement spin-gesture based trade action parameter selection
CN104571670A (en) * 2013-10-09 2015-04-29 纬创资通股份有限公司 Testing method of touch device and system thereof
TWI510913B (en) * 2013-11-12 2015-12-01 Inst Information Industry Testing device and testing method thereof
US9317413B2 (en) 2013-11-12 2016-04-19 Institute For Information Industry Testing device and testing method thereof
US11847315B2 (en) 2013-12-28 2023-12-19 Trading Technologies International, Inc. Methods and apparatus to enable a trading device to accept a user input
US11435895B2 (en) 2013-12-28 2022-09-06 Trading Technologies International, Inc. Methods and apparatus to enable a trading device to accept a user input
TWI510918B (en) * 2014-02-20 2015-12-01 Wistron Corp Method and system for quick testing and detectiing mobile devices
CN103853663A (en) * 2014-03-25 2014-06-11 北京金山网络科技有限公司 Application program test method and system
US9283672B1 (en) 2014-12-11 2016-03-15 Apkudo, Llc Robotic testing device and method for more closely emulating human movements during robotic testing of mobile devices
US9469037B2 (en) 2014-12-11 2016-10-18 Apkudo, Llc Robotic testing device and method for more closely emulating human movements during robotic testing of mobile devices
US9718196B2 (en) 2014-12-11 2017-08-01 Apkudo, Llc Robotic testing device and method for more closely emulating human movements during robotic testing of a user device
US10060801B2 (en) * 2015-03-23 2018-08-28 Nok9 Ab Testing device for wireless power transfer and associated method
RU2640638C2 (en) * 2015-04-24 2018-01-10 Общество С Ограниченной Ответственностью "Яндекс" Method and electronic device for e-mail message processing based on interaction with user
US20210382564A1 (en) * 2015-06-16 2021-12-09 Snap Inc. Radial gesture navigation
US11861068B2 (en) * 2015-06-16 2024-01-02 Snap Inc. Radial gesture navigation
US11093373B1 (en) * 2015-12-07 2021-08-17 Mx Technologies, Inc. Multi-platform testing automation
US11080170B1 (en) 2015-12-07 2021-08-03 Mx Technologies, Inc. Multi-platform testing automation
US11727487B2 (en) 2016-06-27 2023-08-15 Trading Technologies International, Inc. User action for continued participation in markets
US11182853B2 (en) 2016-06-27 2021-11-23 Trading Technologies International, Inc. User action for continued participation in markets
US10942641B2 (en) * 2016-09-30 2021-03-09 Sap Se Synchronized calendar and timeline adaptive user interface
US20200019292A1 (en) * 2016-09-30 2020-01-16 Sap Se Synchronized calendar and timeline adaptive user interface
US11216342B2 (en) * 2016-12-12 2022-01-04 Usablenet Inc. Methods for improved auditing of web sites and devices thereof
US20180165258A1 (en) * 2016-12-12 2018-06-14 Usablenet Inc. Methods for improved auditing of web sites and devices thereof
CN108459764A (en) * 2017-02-15 2018-08-28 霍尼韦尔国际公司 Touch detection device with code debugging device
WO2018204345A1 (en) * 2017-05-02 2018-11-08 Soroco Private Limited Systems and methods for detecting anomalies in execution of computer programs
US10789157B2 (en) 2017-05-02 2020-09-29 Soroco Private Limited Systems and methods for detecting anomalies in execution of computer programs
US10747656B2 (en) * 2018-11-27 2020-08-18 Fmr Llc Systems and methods for mobile automation testing by emulating human behaviors
US11422696B2 (en) * 2019-02-25 2022-08-23 Micro Focus Llc Representation of user interface interactive regions
US10990359B2 (en) * 2019-05-24 2021-04-27 Sap Se Use and advancements of assistive technology in automation for the visually-impaired workforce
US11087581B2 (en) * 2019-11-25 2021-08-10 Igt Correctly interpreting failed touch input using gesture input at gaming devices, and related devices, systems, and methods
CN114020202A (en) * 2021-11-05 2022-02-08 上海怿星电子科技有限公司 Electronic equipment testing method and system

Also Published As

Publication number Publication date
US20110314343A1 (en) 2011-12-22
US8966447B2 (en) 2015-02-24

Similar Documents

Publication Publication Date Title
US20110310041A1 (en) Testing a Touch-Input Program
US10318409B2 (en) Application development environment for portable electronic devices
CN109117215B (en) Self-learning robot process automation
KR102033863B1 (en) Device and method for processing touch input based on intensity
US8291408B1 (en) Visual programming environment for mobile device applications
US8479154B1 (en) Interaction with partially constructed mobile device applications
US9342237B2 (en) Automated testing of gesture-based applications
Mao et al. Robotic testing of mobile apps for truly black-box automation
KR20150047453A (en) Environment and method for cross-platform development of software applications
US20170052527A1 (en) Intelligent mobile device test fixture
CN105824755B (en) A kind of automated testing method, device and mobile terminal
EP3227785B1 (en) Playback and automatic execution of a process to control a computer system
US20120221317A1 (en) Operating system and method based on sensor data
US10152308B2 (en) User interface display testing system
US20120331411A1 (en) Cross process accessibility
US9965464B2 (en) Automatic process guidance
CN104991857B (en) Trace debug method and device
US20160162398A1 (en) Automated test generation and execution for testing a process to control a computer system
CN105339974B (en) Analog sensor
CN107765858A (en) Determine the method, apparatus, terminal and storage medium of facial angle
US20160162168A1 (en) Interaction sensing and recording of a process to control a computer system
Campillo-Sanchez et al. Simulation based software development for smart phones
US11503431B2 (en) Simulator for nearby interactions of devices
Hemalatha et al. Advancement in mobile communication using Android
CN114238113A (en) Application testing method and related device

Legal Events

Date Code Title Description
AS Assignment

Owner name: APPLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WILLIAMS, JOSHUA MATTHEW;GALE, JOHN D.;CREASY, MICHAEL EDWARD;AND OTHERS;SIGNING DATES FROM 20110720 TO 20110802;REEL/FRAME:026721/0738

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION