WO2007070225A1 - Alternative graphics pipe - Google Patents

Alternative graphics pipe

Info

Publication number
WO2007070225A1
Authority
WO
WIPO (PCT)
Prior art keywords
computer
accessibility
graphics pipe
application
readable medium
Application number
PCT/US2006/044927
Other languages
French (fr)
Inventor
Jeremy De Souza
Matthew B. Karr
Original Assignee
Microsoft Corporation
Application filed by Microsoft Corporation filed Critical Microsoft Corporation
Priority to JP2008545614A priority Critical patent/JP4928558B2/en
Priority to CN2006800463346A priority patent/CN101326513B/en
Priority to EP06838086A priority patent/EP1960900A4/en
Priority to KR1020087014169A priority patent/KR101331337B1/en
Priority to BRPI0618551-7A priority patent/BRPI0618551A2/en
Publication of WO2007070225A1 publication Critical patent/WO2007070225A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00: Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 21/00: Teaching, or communicating with, the blind, deaf or mute
    • G09B 21/001: Teaching or communicating with blind persons
    • G09B 21/008: Teaching or communicating with blind persons using visual presentation of the information for the partially sighted
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00: General purpose image data processing
    • G06T 1/20: Processor architectures; Processor configuration, e.g. pipelining


Abstract

Various technologies and techniques are disclosed that improve the operation of accessibility applications. A graphics pipe is provided that can be called in user mode from multiple accessibility programs. A request is received from an accessibility application to access the graphics pipe, and a connection is established. The accessibility application listens to the graphics pipe for particular content of interest and builds a model based on that content. The model is used to deliver content to an end user appropriately. Screen captures can be performed on at least part of the content and then rendered onto another surface.

Description

ALTERNATIVE GRAPHICS PIPE
BACKGROUND
[001] Assistive technologies are software or hardware products that make software applications or operating systems accessible to individuals with a range of disabilities, such as impaired mobility, sight, hearing, etc. Examples of assistive technologies include magnifiers, screen readers, and Braille displays. These products use a variety of data interception techniques throughout the operating system in order to operate. Generally, assistive technologies intercept graphics primitive function calls at the display driver interface (DDI) level and use the operating system kernel state to build off-screen models. Such techniques often cause system instability and crashes.
SUMMARY
[002] Various technologies and techniques are disclosed that improve the operation of accessibility applications. A graphics pipe is provided that can be called in user mode from multiple accessibility programs simultaneously and/or separately. A request is received from an accessibility application to access the graphics pipe, and a connection is established. The accessibility application listens to the graphics pipe for particular content of interest and builds a model based on that content. The model is used to deliver content in an accessibility application to an end user appropriately. Screen captures can be performed on at least part of the content and then rendered onto another surface.
[003] This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
[004] FIG. 1 is a diagrammatic view of parts of a graphics pipe system.
[005] FIG. 2 is a diagrammatic view of a computer system of one implementation of the system of FIG. 1.
[006] FIG. 3 is a diagrammatic view of an accessibility graphics pipe application operating on the computer system of FIG. 2.
[007] FIG. 4 is a high-level process flow diagram for one implementation of the system of FIGS. 1 and 2.
[008] FIG. 5 is a process flow diagram for one implementation of the system of FIGS. 1 and 2 illustrating the stages involved in performing screen captures on the graphics pipe and drawing the screen captures to another surface.
[009] FIG. 6 is a process flow diagram for one implementation of the system of FIGS. 1 and 2 illustrating the stages involved in screen readers or Braille displays accessing the graphics pipe and building a content model.
[010] FIG. 7 is a process flow diagram for one implementation of the system of FIGS. 1 and 2 illustrating the stages involved in magnifiers accessing the graphics pipe and building a content model.
DETAILED DESCRIPTION
[011] For the purposes of promoting an understanding of the principles of the invention, reference will now be made to the embodiments illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope is thereby intended. Any alterations and further modifications in the described embodiments, and any further applications of the principles as described herein are contemplated as would normally occur to one skilled in the art.
[012] The system may be described in the general context as an application that improves the operation of accessibility applications and their related assistive technologies, such as screen readers, screen magnifiers, and Braille displays. One or more of the techniques described herein can be implemented as features within a graphics pipe application, or from any other type of program or service that facilitates accessibility scenarios. As described in further detail herein, in one implementation of the system, a graphics pipe is provided that can be called in user mode from multiple accessibility programs simultaneously. In another implementation, the accessibility application listens to the graphics pipe for particular content of interest and builds a model based on that content. The model is used to deliver content in an accessibility application to an end user appropriately.
[013] As shown in Figure 1, graphics pipe system 20 includes graphics pipe 21 and accessibility applications (22, 24, and 26, respectively). Graphics pipe 21 allows accessibility applications 22, 24, and/or 26 to intercept graphics primitives (e.g. geometry calls, text calls) 18, and/or information related to custom owner drawn controls 19. Graphics pipe 21 serves as a central location for accessibility applications to retrieve graphic display information that can be modeled and used in rendering content (and modifying content, if appropriate) in accessibility scenarios. In one implementation, connections through graphics pipe 21 are in user mode, instead of kernel mode, thereby providing a more reliable operating environment.
[014] In one implementation, accessibility application 22 is coupled to graphics pipe 21 in read-only user mode over communication pathway 28, and serves as screen reader 34. Accessibility application 24 is coupled to graphics pipe 21 in read and/or update user mode over communication pathway 30, and serves as a screen magnifier 36. Similarly, accessibility application 26 is coupled to graphics pipe 21 in read-only user mode over communication pathway 32, and serves as a Braille display. In one implementation, screen readers and Braille displays do not need to alter the content of graphics pipe 21, so their respective connections to graphics pipe 21 are read-only. Numerous other accessibility applications and assistive technologies could be used instead of or in addition to those shown in Figure 1.
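To make the connection model above concrete, the following is a minimal sketch, not the patented Windows implementation or any Microsoft API: GraphicsPipe, Primitive, Connection, and AccessMode are hypothetical stand-ins for graphics pipe 21, the intercepted primitives 18, the communication pathways 28/30/32, and the read-only versus read/update modes.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Callable, List


class AccessMode(Enum):
    READ_ONLY = auto()      # e.g. screen readers, Braille displays
    READ_UPDATE = auto()    # e.g. screen magnifiers


@dataclass
class Primitive:
    kind: str               # "text", "geometry", "owner_drawn", ...
    window_handle: int      # handle of the control window that produced it
    payload: object


@dataclass
class Connection:
    mode: AccessMode
    listener: Callable[[Primitive], None]


class GraphicsPipe:
    """Central user-mode source of display content for accessibility clients."""

    def __init__(self) -> None:
        self._connections: List[Connection] = []

    def connect(self, listener, mode=AccessMode.READ_ONLY) -> Connection:
        conn = Connection(mode, listener)
        self._connections.append(conn)
        return conn

    def disconnect(self, conn: Connection) -> None:
        self._connections.remove(conn)

    def publish(self, primitive: Primitive) -> None:
        # Every connected client sees every primitive; each client filters
        # for the content it cares about.
        for conn in list(self._connections):
            conn.listener(primitive)


if __name__ == "__main__":
    pipe = GraphicsPipe()
    pipe.connect(lambda p: print("screen reader saw:", p.kind))   # read-only client
    pipe.connect(lambda p: print("magnifier saw:", p.kind),
                 mode=AccessMode.READ_UPDATE)                     # read/update client
    pipe.publish(Primitive("text", window_handle=0x1234, payload="File  Edit  View"))
```

Running the module publishes one text primitive to both hypothetical clients, mirroring how screen reader 34 and screen magnifier 36 share the same pipe over separate connections.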
[015] As shown in Figure 2, an exemplary computer system to use for implementing one or more parts of the system 20 includes a computing device, such as computing device 100. In its most basic configuration, computing device 100 typically includes at least one processing unit 102 and memory 104. Depending on the exact configuration and type of computing device, memory 104 may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.) or some combination of the two. This most basic configuration is illustrated in Figure 2 by dashed line 106.
[016] Additionally, device 100 may also have additional features/functionality. For example, device 100 may also include additional storage (removable and/or non-removable) including, but not limited to, magnetic or optical disks or tape. Such additional storage is illustrated in Figure 2 by removable storage 108 and non-removable storage 110. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Memory 104, removable storage 108 and non-removable storage 110 are all examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by device 100. Any such computer storage media may be part of device 100.
[017] Computing device 100 contains one or more communications interface(s) 114 that allow the device to communicate with other devices. For example, communications interface(s) 114 allows computing device 100 to communicate with one or more other computers and/or applications 115, where applicable. Examples of communications interfaces are serial ports, Universal Serial Bus (USB) ports, parallel ports, wireless communication adapters, network adapters, etc. Communications interface(s) 114 are used by computer 100 to exchange information such as communication media with external devices. Some examples of communication media are computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and include any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. The term computer readable media as used herein includes both storage media and communication media.
[018] Device 100 may also have input device(s) 112 such as keyboard, mouse, pen, voice input device, touch input device, etc. Output device(s) 111 such as a display, screen reader, Braille display, magnifier, speakers, printer, etc. may also be included. These devices are well known in the art and need not be discussed at length here.
[019] Turning now to Figure 3 with continued reference to Figure 2, an accessibility graphics pipe application 200 operating on computing device 100 is illustrated. In one implementation, accessibility graphics pipe application 200 is included as part of the resident operating system on system memory 104, such as MICROSOFT® WINDOWS® or Linux. In another embodiment, accessibility graphics pipe application 200 is one of the application programs that reside on computing device 100. Alternatively or additionally, one or more parts of accessibility graphics pipe application can be part of computers and/or applications 115, or other such variations as would occur to one in the computer software art.
[020] Accessibility graphics pipe application 200 includes business logic 204, which is responsible for carrying out some or all of the techniques described herein. Business logic may include logic 206 for allowing read and/or updates to the graphics pipe by accessibility applications, logic 208 for supporting legacy content primitives, logic 210 for tagging content in the pipe with a handle to the control window, logic 212 for making off-screen content available as a bitmap, logic 214 for providing an indication that one or more assistive technologies are connected to the graphics pipe, logic 216 for allowing multiple clients to access the pipe concurrently and/or asynchronously, logic 218 for allowing owner drawn controls to be accessed through the graphics pipe, logic 220 for forcing applications to repaint upon new client connection to the graphics pipe, and other logic 222 for operating accessibility graphics pipe application 200.
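As an illustrative sketch only, reusing the hypothetical GraphicsPipe from the earlier example rather than paragraph [020]'s actual implementation, the fragment below shows how two of the listed behaviors might fit together: the client-connected indication of logic 214 and the forced repaint of logic 220. The repaint() method on the application objects is an assumption made for the example.

```python
class GraphicsPipeServer:
    """Server-side sketch built around the GraphicsPipe from the earlier example."""

    def __init__(self, pipe, running_apps):
        self._pipe = pipe                   # the hypothetical GraphicsPipe
        self._running_apps = running_apps   # objects assumed to expose a repaint() method
        self._client_count = 0

    @property
    def assistive_technology_connected(self) -> bool:
        # Logic 214: a client status flag telling applications whether any
        # assistive technology is currently attached to the pipe.
        return self._client_count > 0

    def attach_client(self, listener, mode):
        conn = self._pipe.connect(listener, mode)
        self._client_count += 1
        # Logic 220: force every running application to repaint so the newly
        # connected client immediately receives the most current content.
        for app in self._running_apps:
            app.repaint()
        return conn

    def detach_client(self, conn):
        self._pipe.disconnect(conn)
        self._client_count -= 1
```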
[021] In one implementation, accessibility graphics pipe application 200 resides on computing device 100. It will be understood that business logic 204 of graphics pipe application 200 can alternatively or additionally be embodied as computer-executable instructions on one or more computers and/or in different variations than shown on Figures 2 and 3. As one non-limiting example, one or more parts of business logic 204 could alternatively or additionally be implemented as a service that resides on an external computer that is called when needed.
[022] Turning now to Figures 4-7 with continued reference to Figures 1-3, the stages for implementing one or more implementations of accessibility graphics pipe application 200 are described in further detail. It will be appreciated that some, all, or fewer of these stages can be performed, and that they can be performed in a variety of different orders than as described in Figures 4-7. Figure 4 is a high level process flow diagram of one implementation of accessibility graphics pipe 200. In one form, the process of Figure 4 is at least partially implemented in the operating logic of computing device 100 and is executed as part of business logic 204.
[023] The process begins at start point 240 with the accessibility application opening a connection to the graphics pipe in user mode (stage 242). In one implementation, when the accessibility application connects, the graphics pipe tells all applications to repaint (stage 244) so they will have the most current content. The graphics pipe provides content (stage 246), and the accessibility application listens to the pipe for that content (stage 248). The content can include a client status flag indicating whether or not assistive technology is connected (stage 246), and/or the content can include off-screen content rendered in bitmaps (stage 246). The accessibility application builds a model using at least part of the content from the pipe (stage 250). The accessibility application closes the connection to the graphics pipe when finished (stage 252). The stages are repeated for each accessibility application (one or more of 22, 24, and/or 26) that accesses the graphics pipe, which can be simultaneously and/or separately (stage 254). The process then ends at end point 256.
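A client-side sketch of this Figure 4 flow, again using the hypothetical pipe object from the earlier example rather than any real accessibility API, might look like the following; drive_pipe is an artificial hook that stands in for content arriving while the connection is open.

```python
def collect_model(pipe, is_interesting, drive_pipe):
    """Open a connection, listen for content of interest, build a model,
    and close the connection when finished (stages 242-252)."""
    model = []                                   # stage 250: model built from pipe content

    def on_primitive(primitive):                 # stage 248: listen to the pipe
        if is_interesting(primitive):
            model.append(primitive)

    conn = pipe.connect(on_primitive)            # stage 242: open the connection in user mode
    try:
        drive_pipe()                             # stage 246: content flows through the pipe
    finally:
        pipe.disconnect(conn)                    # stage 252: close when finished
    return model
```

Because the sketch's pipe simply fans each primitive out to every listener, several such clients can hold connections at once, which corresponds to the simultaneous and/or separate access of stage 254.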
[024] Turning now to Figure 5, a process flow diagram for one implementation of the system of Figure 1 illustrates the stages involved in performing screen captures on the graphics pipe and drawing the screen captures to another surface. In one form, the process of Figure 5 is at least partially implemented in the operating logic of computing device 100. The process begins at start point 260 with the accessibility application opening a connection to the graphics pipe in user mode (stage 262). The accessibility application listens to the graphics pipe and performs screen captures on at least a portion of the content (stage 264). The accessibility application then draws at least some of the screen captures to another surface, such as to a file or video for vision assistance and/or training (stage 266). The accessibility application then closes the connection to the graphics pipe (stage 268). The process then ends at end point 269.
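One way to picture this Figure 5 capture flow, purely as a sketch on top of the hypothetical pipe above: a plain JSON file stands in for the "other surface", and region_filter and the file path are assumptions of the example, not elements of the disclosed system.

```python
import json
from contextlib import contextmanager


@contextmanager
def capture_session(pipe, region_filter, path):
    """Collect matching content while the connection is open (stage 264),
    then write it out when the session ends (stage 266)."""
    captures = []

    def on_primitive(primitive):
        if region_filter(primitive):                 # capture only part of the content
            captures.append({"kind": primitive.kind,
                             "payload": str(primitive.payload)})

    conn = pipe.connect(on_primitive)                # stage 262: open in user mode
    try:
        yield captures                               # caller lets content flow meanwhile
    finally:
        pipe.disconnect(conn)                        # stage 268: close the connection
        with open(path, "w", encoding="utf-8") as out:
            json.dump(captures, out)                 # "draw" to another surface (a file)
```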
[025] Turning now to Figure 6, a process flow diagram for one implementation of the system of Figure 1 illustrates the stages involved in screen readers or Braille displays accessing the graphics pipe and building a content model. In one form, the process of Figure 6 is at least partially implemented in the operating logic of computing device 100. The process begins at start point 270 with a screen reader or Braille display client application opening a connection to the graphics pipe, such as in a read-only fashion (stage 272). The screen reader or Braille display client application listens to the graphics pipe for relevant information (stage 274). The screen reader or Braille display builds off-screen models and uses these models to output spoken voice or tactile feedback (stage 276). The screen reader or Braille display client application closes the connection to the graphics pipe (stage 278). The process then ends at end point 280.
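A corresponding sketch of the Figure 6 flow follows; the print calls are stand-ins for real speech synthesis or Braille output, and ScreenReaderClient is a hypothetical name rather than part of the disclosed system.

```python
class ScreenReaderClient:
    """Read-only client that builds an off-screen model of text primitives."""

    def __init__(self, pipe):
        self.model = {}                               # window handle -> list of text runs
        self._pipe = pipe
        self._conn = pipe.connect(self.on_primitive)  # stage 272: read-only connection

    def on_primitive(self, primitive):                # stage 274: listen for relevant info
        if primitive.kind == "text":
            self.model.setdefault(primitive.window_handle, []).append(str(primitive.payload))

    def announce(self, window_handle):                # stage 276: use the off-screen model
        text = " ".join(self.model.get(window_handle, []))
        print("speak:", text)                         # stand-in for spoken output
        print("braille:", text)                       # stand-in for tactile output

    def close(self):                                  # stage 278: close the connection
        self._pipe.disconnect(self._conn)
```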
[026] Turning now to Figure 7, a process flow diagram for one implementation of the system of Figure 1 illustrates the stages involved in magnifiers accessing the graphics pipe and building a content model. In one form, the process of Figure 7 is at least partially implemented in the operating logic of computing device 100. The process begins at start point 300 with the magnification application opening a connection to the graphics pipe such as in read-only and/or update mode (stage 302). The magnification application listens to the graphics pipe for relevant information (stage 304). The magnification application removes the client window from magnified content, if applicable (stage 306).
[027] Alternatively or additionally, the magnification application rescales content that it obtains from the graphics pipe, such as primitives and/or surfaces (stage 308). Any pre-composition filtering is also performed, if applicable (stage 310). The magnification application composes visuals and renders the data that have been magnified (stage 312). Post-composition filtering is performed by the magnification application, if applicable (stage 314). When finished, the magnification application closes the connection to the graphics pipe (stage 316). The process then ends at end point 318.
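Finally, a sketch of the Figure 7 magnifier stages reduced to plain data transformations; a real magnifier would rescale surfaces and geometry rather than tagging records with a zoom factor, and all names here are illustrative assumptions.

```python
def magnify(primitives, own_window_handle, zoom=2.0, pre_filter=None, post_filter=None):
    """Data-only sketch of the Figure 7 stages."""
    # Stage 306: drop content drawn by the magnifier's own client window.
    visible = [p for p in primitives if p.window_handle != own_window_handle]

    # Stage 308: "rescale" the content obtained from the pipe (here, just tag it).
    scaled = [{"kind": p.kind, "payload": p.payload, "zoom": zoom} for p in visible]

    # Stage 310: pre-composition filtering, if applicable.
    if pre_filter:
        scaled = [pre_filter(item) for item in scaled]

    # Stage 312: compose the magnified visuals into one frame.
    frame = {"zoom": zoom, "items": scaled}

    # Stage 314: post-composition filtering, if applicable.
    return post_filter(frame) if post_filter else frame
```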
[028] Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims. All equivalents, changes, and modifications that come within the spirit of the implementations as described herein and/or by the following claims are desired to be protected.
[029] For example, a person of ordinary skill in the computer software art will recognize that the client and/or server arrangements, user interface screen content, and/or data layouts as described in the examples discussed herein could be organized differently on one or more computers to include fewer or additional options or features than as portrayed in the examples.

Claims

What is claimed is:
1. A computer-readable medium having computer-executable instructions for causing a computer to perform steps comprising: providing a graphics pipe that is operable to be called from a plurality of accessibility applications (21); receiving a request from a first accessibility application to access the graphics pipe (242); establishing a connection between the first accessibility application and the graphics pipe (242); and providing content from the graphics pipe to the first accessibility application (246).
2. The computer-readable medium of claim 1, further comprising the steps of: receiving a request from a second accessibility application to access the graphics pipe during at least a portion of a same time period as the first accessibility application (254); establishing a connection between the second accessibility application and the graphics pipe (242); and providing content from the graphics pipe to the second accessibility application (246).
3. The computer-readable medium of claim 1, wherein the graphics pipe communicates with each running application and tells each running application to repaint after the connection is established with the first accessibility application (244).
4. The computer-readable medium of claim 1, wherein the graphics pipe is operable to be called in a user mode (242).
5. The computer-readable medium of claim 1, wherein the graphics pipe is operable to be called asynchronously (216).
6. The computer-readable medium of claim 1, wherein the accessibility application is operable to communicate with at least one output device selected from the group consisting of a screen reader, a Braille display, and a magnifier (111).
7. The computer-readable medium of claim 1, wherein the providing content step further comprises the step of: providing a client status flag to indicate whether an assistive technology is connected to the graphics pipe at a particular moment (246).
8. The computer-readable medium of claim 1, wherein the providing content step further comprises the step of: providing off-screen content in a bitmap format (246).
9. A computer-readable medium having computer-executable instructions for causing a computer to perform steps comprising: opening a connection to a graphics pipe from an accessibility application (262); from the accessibility application, listening to a set of content received from the graphics pipe and performing a set of screen captures on at least a portion of the content (264); drawing at least some of the screen captures to another surface (266); and closing the connection between the accessibility application and the graphics pipe (268).
10. The computer-readable medium of claim 9, wherein the screen captures are drawn to another surface for vision assistance (266).
11. The computer-readable medium of claim 9, wherein the screen captures are written to a file (266).
12. The computer-readable medium of claim 11, wherein the screen captures are written to a file for use in a training video (266).
13. A method for using an accessibility graphics pipe comprising the steps of: opening a connection between a graphics pipe and an accessibility application in a user mode (242); from the accessibility application, listening to the graphics pipe for relevant information (248); and from the accessibility application, building an off-screen model with at least some of the information (250).
14. The method of claim 13, wherein the off-screen model is used to output a spoken voice (276).
15. The method of claim 13, wherein the off-screen model is used to output tactile feedback (276).
16. The method of claim 13, wherein the off-screen model is used to output tactile feedback (276).
17. The method of claim 13, wherein the accessibility application is a screen reader (274).
18. The method of claim 13, wherein the accessibility application is a Braille provider (272).
19. The method of claim 13, wherein the accessibility application is a magnifier (302).
20. A computer-readable medium having computer-executable instructions for causing a computer to perform the steps recited in claim 13 (200).
PCT/US2006/044927 2005-12-12 2006-11-17 Alternative graphics pipe WO2007070225A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
JP2008545614A JP4928558B2 (en) 2005-12-12 2006-11-17 Alternative graphics pipe
CN2006800463346A CN101326513B (en) 2005-12-12 2006-11-17 Method for utilizing alternative graphics pipe
EP06838086A EP1960900A4 (en) 2005-12-12 2006-11-17 Alternative graphics pipe
KR1020087014169A KR101331337B1 (en) 2005-12-12 2006-11-17 Alternative graphics pipe
BRPI0618551-7A BRPI0618551A2 (en) 2005-12-12 2006-11-17 alternate graphic redirection

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/299,535 US7773096B2 (en) 2005-12-12 2005-12-12 Alternative graphics pipe
US11/299,535 2005-12-12

Publications (1)

Publication Number Publication Date
WO2007070225A1 true WO2007070225A1 (en) 2007-06-21

Family

ID=38138816

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/044927 WO2007070225A1 (en) 2005-12-12 2006-11-17 Alternative graphics pipe

Country Status (8)

Country Link
US (1) US7773096B2 (en)
EP (1) EP1960900A4 (en)
JP (1) JP4928558B2 (en)
KR (1) KR101331337B1 (en)
CN (1) CN101326513B (en)
BR (1) BRPI0618551A2 (en)
RU (1) RU2433462C2 (en)
WO (1) WO2007070225A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7773096B2 (en) * 2005-12-12 2010-08-10 Microsoft Corporation Alternative graphics pipe
US8005013B2 (en) * 2007-06-12 2011-08-23 Hewlett-Packard Development Company, L.P. Managing connectivity in a virtual network
US8209707B2 (en) * 2008-01-11 2012-06-26 Google Inc. Gathering state information for an application and kernel components called by the application
KR101681644B1 (en) 2011-10-18 2016-12-02 삼성디스플레이 주식회사 Liquid crystal display and manufacturing method thereof
GR1008064B (en) * 2012-10-11 2013-12-18 Εθνικο Κεντρο Ερευνας Και Τεχνολογικης Αναπτυξης (Ε.Κ.Ε.Τ.Α)/Ινστιτουτο Βιωσιμης Κινητικοτητας Και Δικτυων, Method of transferring messages regarding the current state of the graphical environment of a software application on terminals
CN104995922B (en) * 2013-03-14 2019-07-02 英特尔公司 For the device of personal broadcaster, method and storage medium
US9734312B1 (en) * 2015-08-12 2017-08-15 Symantec Corporation Systems and methods for detecting when users are uninstalling applications
US9864949B1 (en) 2016-10-31 2018-01-09 International Business Machines Corporation Cognition model-based product assist

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030117441A1 (en) * 2001-12-21 2003-06-26 Walls Jeffrey J. System and method for configuring graphics pipelines in a computer graphical display system
US20030128216A1 (en) * 2001-12-21 2003-07-10 Walls Jeffrey J. System and method for automatically configuring graphics pipelines by tracking a region of interest in a computer graphical display system
US20050166214A1 (en) * 2002-07-29 2005-07-28 Silicon Graphics, Inc. System and method for managing graphics applications

Family Cites Families (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH086493A (en) * 1993-07-21 1996-01-12 Texas Instr Inc <Ti> Tangible-type display that can be electronically refreshed for braille text and braille diagram
JP3236180B2 (en) * 1994-12-05 2001-12-10 日本電気株式会社 Coordinate pointing device
JPH09258946A (en) * 1996-03-26 1997-10-03 Fujitsu Ltd Information processor
US5754348A (en) * 1996-05-14 1998-05-19 Planetweb, Inc. Method for context-preserving magnification of digital image regions
JPH1083269A (en) * 1996-09-09 1998-03-31 Nec Corp User interface converting device
US6225920B1 (en) * 1997-04-14 2001-05-01 Randy A. Dayle Portable computer apparatus for assisting persons with cognitive disabilities
US7596755B2 (en) * 1997-12-22 2009-09-29 Ricoh Company, Ltd. Multimedia visualization and integration environment
CA2276636A1 (en) 1998-06-30 1999-12-30 Sun Microsystems, Inc. Consistent and uniform programming interface arrangement and method for enabling "assistive technology" programs to obtain information from and to control graphical user interfaceobjects
JP3831538B2 (en) * 1998-11-26 2006-10-11 インターナショナル・ビジネス・マシーンズ・コーポレーション Power saving method and apparatus for display
US6546431B1 (en) * 1999-03-12 2003-04-08 International Business Machines Corporation Data processing system and method for sharing user interface devices of a provider assistive technology application with disparate user assistive technology applications
GB2352313B (en) 1999-07-21 2003-10-15 Ncr Int Inc Transaction system
US6538660B1 (en) * 1999-11-12 2003-03-25 International Business Machines Corporation Method, system, and program for superimposing data from different application programs
US6829746B1 (en) * 1999-12-09 2004-12-07 International Business Machines Corp. Electronic document delivery system employing distributed document object model (DOM) based transcoding
TW495716B (en) * 2000-01-21 2002-07-21 Dream Technologies Corp Control device and method for starting computer application software and multi-monitor computer, client-server system, and memory media thereof
GB0003311D0 (en) * 2000-02-15 2000-04-05 Koninkl Philips Electronics Nv Autostereoscopic display driver
US6891533B1 (en) * 2000-04-11 2005-05-10 Hewlett-Packard Development Company, L.P. Compositing separately-generated three-dimensional images
JP2002196732A (en) * 2000-04-27 2002-07-12 Toshiba Corp Display device, picture control semiconductor device, and method for driving the display device
US20020091991A1 (en) * 2000-05-11 2002-07-11 Castro Juan Carlos Unified real-time microprocessor computer
US7119809B1 (en) * 2000-05-15 2006-10-10 S3 Graphics Co., Ltd. Parallel architecture for graphics primitive decomposition
US6918091B2 (en) * 2000-11-09 2005-07-12 Change Tools, Inc. User definable interface system, method and computer program product
US20020155419A1 (en) * 2001-04-19 2002-10-24 International Business Machines Corporation Customizable online testing for people with disabilities
US6981246B2 (en) * 2001-05-15 2005-12-27 Sun Microsystems, Inc. Method and apparatus for automatic accessibility assessment
US20040148568A1 (en) * 2001-06-13 2004-07-29 Springer Timothy Stephen Checker and fixer algorithms for accessibility standards
US6802055B2 (en) * 2001-06-27 2004-10-05 Microsoft Corporation Capturing graphics primitives associated with any display object rendered to a graphical user interface
JP3746211B2 (en) * 2001-08-03 2006-02-15 株式会社ソニー・コンピュータエンタテインメント Drawing apparatus, drawing method, drawing program, computer-readable recording medium recording the drawing program, and graphics processor
US6931151B2 (en) * 2001-11-21 2005-08-16 Intel Corporation Method and apparatus for modifying graphics content prior to display for color blind use
US7352356B2 (en) * 2001-12-13 2008-04-01 United States Of America Refreshable scanning tactile graphic display for localized sensory stimulation
US6784905B2 (en) * 2002-01-22 2004-08-31 International Business Machines Corporation Applying translucent filters according to visual disability needs
US6876369B2 (en) * 2002-01-22 2005-04-05 International Business Machines Corp. Applying translucent filters according to visual disability needs in a network environment
US6909432B2 (en) * 2002-02-27 2005-06-21 Hewlett-Packard Development Company, L.P. Centralized scalable resource architecture and system
US7093199B2 (en) * 2002-05-07 2006-08-15 International Business Machines Corporation Design environment to facilitate accessible software
US6889337B1 (en) * 2002-06-03 2005-05-03 Oracle International Corporation Method and system for screen reader regression testing
US7168049B2 (en) * 2002-06-18 2007-01-23 Silicon Graphics, Inc. System and method for allocating computing resources
US6982682B1 (en) * 2002-07-29 2006-01-03 Silicon Graphics, Inc. System and method for managing graphics applications
US7287984B2 (en) * 2002-10-15 2007-10-30 Techenable, Inc. System and method for providing a visual language for non-reading sighted persons
US20040218451A1 (en) * 2002-11-05 2004-11-04 Said Joe P. Accessible user interface and navigation system and method
US6990491B2 (en) * 2002-12-12 2006-01-24 International Business Machines Corporation System and method for accessibility data maintenance and privilege authorization
US20040139370A1 (en) * 2003-01-14 2004-07-15 Dan Bailey Source code analysis
US7119808B2 (en) * 2003-07-15 2006-10-10 Alienware Labs Corp. Multiple parallel processor computer graphics system
US20050015255A1 (en) * 2003-07-18 2005-01-20 Pitney Bowes Incorporated Assistive technology for disabled people and others utilizing a remote service bureau
US7857138B2 (en) * 2003-12-02 2010-12-28 Mary Darlene Temple Apparatus and method for delivery of medication
US20050233287A1 (en) * 2004-04-14 2005-10-20 Vladimir Bulatov Accessible computer system
US8196104B2 (en) * 2005-08-31 2012-06-05 Sap Ag Systems and methods for testing application accessibility
US7773096B2 (en) * 2005-12-12 2010-08-10 Microsoft Corporation Alternative graphics pipe

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030117441A1 (en) * 2001-12-21 2003-06-26 Walls Jeffrey J. System and method for configuring graphics pipelines in a computer graphical display system
US20030128216A1 (en) * 2001-12-21 2003-07-10 Walls Jeffrey J. System and method for automatically configuring graphics pipelines by tracking a region of interest in a computer graphical display system
US20040104913A1 (en) * 2001-12-21 2004-06-03 Walls Jeffrey J. System and method for automatically configuring graphics pipelines by tracking a region of interest in a computer graphical display system
US20050166214A1 (en) * 2002-07-29 2005-07-28 Silicon Graphics, Inc. System and method for managing graphics applications

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1960900A4 *

Also Published As

Publication number Publication date
CN101326513A (en) 2008-12-17
US20070132753A1 (en) 2007-06-14
EP1960900A1 (en) 2008-08-27
RU2433462C2 (en) 2011-11-10
JP2009519542A (en) 2009-05-14
BRPI0618551A2 (en) 2011-09-06
EP1960900A4 (en) 2009-01-07
US7773096B2 (en) 2010-08-10
KR101331337B1 (en) 2013-11-22
CN101326513B (en) 2012-05-23
KR20080076938A (en) 2008-08-20
RU2008123838A (en) 2009-12-27
JP4928558B2 (en) 2012-05-09

Similar Documents

Publication Publication Date Title
US7773096B2 (en) Alternative graphics pipe
US7185116B2 (en) Template-based customization of a user interface for a messaging application program
US20090044112A1 (en) Animated Digital Assistant
CN108388650A (en) Need-based search processing method, device and smart machine
CN107423055A (en) Method, apparatus, equipment and the storage medium of adaptive terminal device resolution
CN105094824A (en) Display method for notification messages on intelligent watch and intelligent watch
CN108008876A (en) A kind of display methods of suspension windows, device, equipment and storage medium
US7567253B2 (en) Mirror driver notification of device independent bitmap drawing calls
CN113378855A (en) Method for processing multitask, related device and computer program product
CN110287384B (en) Intelligent service method, device and equipment
CN107862035A (en) Network read method, device, Intelligent flat and the storage medium of minutes
CN107301220A (en) Method, device, equipment and the storage medium of data-driven view
CN116107680A (en) Operation guiding method and device of mobile terminal and electronic equipment
CN111054072B (en) Method, device, equipment and storage medium for role model tailing
US11417093B1 (en) Image capture with context data overlay
US10657692B2 (en) Determining image description specificity in presenting digital content
KR20180076188A (en) Method for embodiment of augmented reality using marker and Voice Recognition
CN113313623A (en) Watermark information display method, watermark information display device, electronic equipment and computer readable medium
CN111860214A (en) Face detection method, training method and device of model thereof and electronic equipment
CN111161737A (en) Data processing method and device, electronic equipment and storage medium
CN110327626A (en) Virtual server creation method and device
CN115695635B (en) Operation prompting method, storage medium and electronic equipment
CN116628153B (en) Method, device, equipment and medium for controlling dialogue of artificial intelligent equipment
US20240038223A1 (en) Speech recognition method and apparatus
KR20010091193A (en) System for editing graphic data on the web

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200680046334.6

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2373/CHENP/2008

Country of ref document: IN

WWE Wipo information: entry into national phase

Ref document number: MX/a/2008/006619

Country of ref document: MX

REEP Request for entry into the european phase

Ref document number: 2006838086

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2006838086

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2008123838

Country of ref document: RU

WWE Wipo information: entry into national phase

Ref document number: 2008545614

Country of ref document: JP

Ref document number: 1020087014169

Country of ref document: KR

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: PI0618551

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20080513