US20020175911A1 - Selecting a target object in three-dimensional space - Google Patents

Selecting a target object in three-dimensional space

Info

Publication number
US20020175911A1
US20020175911A1
Authority
US
United States
Prior art keywords
objects
link
virtual
dimensional space
link object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09/863,046
Inventor
John Light
Michael Smith
John Miller
Sunil Kasturi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intel Corp
Original Assignee
Intel Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intel Corp
Priority to US09/863,046
Assigned to INTEL CORPORATION. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KASTURI, SUNIL; LIGHT, JOHN J.; MILLER, JOHN D.; SMITH, MICHAEL D.
Publication of US20020175911A1
Legal status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects


Abstract

A target object is selected in a virtual three-dimensional space by identifying objects, including the target object, in the virtual three-dimensional space, determining distances between the objects and a point in the virtual three-dimensional space, prioritizing the objects based on distances and identities of the objects, and selecting the target object from among the objects based on priority.

Description

    TECHNICAL FIELD
  • This invention relates to selecting a target object in virtual three-dimensional (3D) space. [0001]
  • BACKGROUND
  • A virtual 3D space includes objects that are either link objects or non-link objects. Non-link objects represent data, such as Microsoft® Word® documents. Link objects connect non-link objects to one another. That is, link objects represent the relationship of one non-link object to another non-link object. For example, a “table of contents” (i.e., a non-link object) may contain links to several documents referenced in the table.[0002]
  • DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a view of a target object in virtual 3D space. [0003]
  • FIG. 2 is a 3D view of objects in the 3D space. [0004]
  • FIG. 3 is a view of a link object with an extended area. [0005]
  • FIG. 4 is a view of a link object with an extended area after the process of FIG. 5 is executed. [0006]
  • FIG. 5 is a flowchart of a process for selecting a target object in 3D space. [0007]
  • FIG. 6 is a block diagram of a computer system on which the process of FIG. 5 may be implemented.[0008]
  • DESCRIPTION
  • FIG. 1 shows objects in a 3D environment. The objects include non-link objects, such as objects 6, and link objects, such as objects 3. Non-link objects represent data. The data can be a computer file or any defined set of information. For example, a word processing document, a set of computer instructions, or a list of information could all be represented by non-link object 6. [0009]
  • A user may select a non-link object 6 in order to access data located within the non-link object or to manipulate its location and properties. The selected non-link object is referred to herein as “target object 4”. A user selects target object 4 by moving a cursor over the object in 3D space and pressing a key on the keyboard or a button on the mouse. An object may be selected for a number of reasons. For example, a user may want to change the name of the file represented by the object or to open the file. Once selected, the user has access to the file to make any necessary changes. [0010]
  • FIG. 1 also shows link objects 3. Link objects 3 may be depicted as lines or curves. Link objects 3 represent relationships between non-link objects 6. An association between a target object 4 and a non-link object 6a is formed by connecting a first end 10 of link object 3a to target object 4 and a second end 11 of link object 3a to non-link object 6a. For example, target object 4 may represent a directory of files on a personal computer and non-link object 6a may represent a file located within the directory. Several link objects 3 may connect to a single non-link object 6, as shown in FIG. 1 (a minimal sketch of this data model appears below). [0011]
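
To make the object model concrete, here is a minimal sketch in Python; the class names, fields, and example values are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class NonLinkObject:
    """Represents data, e.g., a file or a directory (hypothetical model)."""
    name: str

@dataclass
class LinkObject:
    """Represents a relationship between two non-link objects; the fields
    stand in for the link's two ends, 10 and 11 (hypothetical model)."""
    first_end: NonLinkObject   # end 10, attached to, e.g., target object 4
    second_end: NonLinkObject  # end 11, attached to, e.g., non-link object 6a

directory = NonLinkObject("reports")       # plays the role of target object 4
document = NonLinkObject("summary.doc")    # plays the role of non-link object 6a
link_3a = LinkObject(directory, document)  # plays the role of link object 3a
```
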
  • In the virtual 3D space of FIG. 1, a link object 3 may be positioned in front of a non-link object 6. A user may want to select a link object 3 to change relationships among non-link objects 6. Link object 3 may be selected anywhere along its length, e.g., from end 10 to end 11. [0012]
  • Referring to FIG. 2, a virtual camera 50 is located at an arbitrary point in the virtual 3D space. A distance (depth) 56 is measured from camera 50 to an object 51. Distance (depth) 58 is the distance between camera 50 and object 52. Distance 56 is shorter than distance 58; accordingly, object 51 is considered closer to camera 50 than object 52 in the virtual 3D space. Generally speaking, when the cursor is placed over two objects, the object closer to camera 50 is selected (a sketch of this distance computation appears below). The exception to this general rule is described later. [0013]
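
A minimal sketch of that distance computation, assuming camera 50 and each object are represented by Cartesian xyz tuples (the representation is an assumption for illustration; the patent does not specify one):

```python
import math

def distance_to_camera(camera_xyz, object_xyz):
    """Euclidean distance from the virtual camera to an object, computed
    from the difference of their Cartesian xyz coordinates."""
    dx, dy, dz = (o - c for o, c in zip(object_xyz, camera_xyz))
    return math.sqrt(dx * dx + dy * dy + dz * dz)

camera = (0.0, 0.0, 0.0)
print(distance_to_camera(camera, (1.0, 2.0, 2.0)))   # 3.0, e.g., distance 56
print(distance_to_camera(camera, (3.0, 4.0, 12.0)))  # 13.0, e.g., distance 58
```
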
  • Referring to FIG. 3, an extended area 9 represents a tolerance area for each link object. This tolerance area is generally on the order of several pixels around the link. Selecting a pixel in the tolerance area causes the corresponding link to be selected as well. This can result in erroneous object selection, as explained below. [0014]
  • More specifically, the user does not see extended area 9; it is not displayed on screen. As shown in FIG. 3, a link object 3a can be placed in front of target object 4. Since the area from which to select target object 4 is only a few pixels high and wide, centered on the cursor point, extended area 9 can interfere with the selection of target object 4: a user who attempts to select target object 4 cannot do so, because extended area 9 of link object 3a is closer to camera 50 than target object 4. As a result, attempting to select target object 4, e.g., at point 7, would actually cause link object 3a to be selected and not target object 4. [0015]
  • Referring to FIG. 5, a process 20 is shown for preventing extended area 9 from extending over target object 4 and obstructing a user's ability to select target object 4. Process 20 takes into account whether an object is a link object 3 or a non-link object 6 during selection, as described in detail below. [0016]
  • Briefly, process 20 gives precedence to non-link objects that are obscured by link objects by less than a predetermined number of pixels in the z-direction of 3D space. The effective result of process 20 is shown in FIG. 4: in effect, the extended areas 9 over target object 4 are ignored, allowing the user to select target object 4 relatively easily. [0017]
  • In more detail, process 20 receives (21) coordinates based on user input. For example, a user may select target object 4 by pointing and clicking using a mouse. Process 20 locates (22) the objects in 3D space under the cursor at the user-selected coordinates. Because user selection is made in the x-y plane of the computer screen, two or more objects, including extended areas, may overlap in the z-direction, so more than one object may be located at the user-selected coordinates. Process 20 obtains object characteristics for each located object. Those characteristics include the position of the object and its type. The type of each object may indicate whether the object is a non-link object or a link object. The position of each object may be the object's xyz coordinates (a sketch of such a per-object record appears below). [0018]
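
A sketch of the per-object record that steps (22) through (24) might gather; the dataclass and its field names are assumptions for illustration only:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class PickedObject:
    """One object located under the cursor at the user-selected coordinates."""
    object_id: int
    is_link: bool  # object type: link object (True) or non-link object (False)
    position: Tuple[float, float, float]  # the object's xyz coordinates
    depth: int = 0  # distance from camera 50 in normalized depth units,
                    # filled in by step (24)
```
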
  • Process 20 identifies (23) the located objects by analyzing their characteristics, labeling each object, including target object 4, as either a link object or a non-link object. Process 20 determines (24) distances between the located objects and camera 50 using the position of each object. Process 20 does this by taking the difference between the coordinates of the objects' locations. For example, referring to FIG. 2, distance 56 between camera 50 and object 51 is computed from the difference of the Cartesian xyz coordinate values of camera 50 and object 51. [0019]
  • Process 20 prioritizes (25) the objects based on the objects' distances from one another and the identities of the objects. Object priorities may be stored in a list in memory and retrieved by process 20 when necessary. Generally, link objects 3 are given a lower priority than non-link objects 6. For example, a non-link object 6 and a link object 3 may lie at the same distance (depth) relative to camera 50; process 20 nevertheless assigns non-link object 6 a higher priority than link object 3. [0020]
  • In another case, such as that shown in FIG. 3, link object 3a is actually closer to camera 50 than target object 4. In this case, process 20 gives priority to non-link object 6 only if it is less than a certain distance (depth) behind link object 3 relative to camera 50; otherwise, process 20 gives priority to link object 3. Thus, for link object 3a to be selected, link object 3a must be closer to camera 50 than target object 4 by at least the predetermined distance. [0021]
  • Process 20 selects (26) target object 4 from among the objects using their stored priorities. By way of example, assume that target object 4 and link object 3a (via its extended area) are both under the cursor when the user clicks. Process 20 will select object 4 if (1) it is a non-link object and (2) object 4 is less than a predetermined distance (i.e., number of pixels) behind any overlapping link object 3, the distances being determined with respect to camera 50. If these two criteria are not met, process 20 will select link object 3a; the sketch below illustrates this rule. [0022]
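
These two criteria translate directly into a comparison of depths. A minimal sketch, reusing the hypothetical PickedObject record above and treating the predetermined distance as a constant (0x1000000, the value given in the embodiment that follows):

```python
PREDETERMINED_DISTANCE = 0x1000000  # tolerated depth gap, in normalized units

def select_object(candidates):
    """Pick one object from all candidates under the cursor (step 26).
    Each candidate needs an is_link flag and a depth from camera 50."""
    if not candidates:
        return None
    # The nearest hit wins by default; on a depth tie, a non-link object
    # wins because False sorts before True.
    nearest = min(candidates, key=lambda c: (c.depth, c.is_link))
    if not nearest.is_link:
        return nearest
    # The nearest hit is a link (possibly via its extended area 9): a non-link
    # object still wins if it is less than PREDETERMINED_DISTANCE behind it.
    non_links = [c for c in candidates if not c.is_link]
    if non_links:
        best = min(non_links, key=lambda c: c.depth)
        if best.depth - nearest.depth < PREDETERMINED_DISTANCE:
            return best
    return nearest
```

Under this rule, a click that lands on extended area 9 still selects target object 4 whenever the link is less than the predetermined distance in front of it.
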
  • In one embodiment, an OpenGL depth buffer provides the information used to prioritize selection of objects. Using OpenGL, all objects under the cursor are tagged with depth information normalized between 0 and 0xffffffff, front to back. As described above, non-link objects 6 have priority over link objects 3 up to a predetermined depth difference. In this example, for a link object 3 to be selected, there must be a depth difference of at least 0x1000000 between the link and non-link objects. [0023]
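
A sketch of how those normalized depths relate to the threshold. It assumes window-space depths in [0.0, 1.0] scaled to 32-bit integers, which makes 0x1000000 a fixed 1/256 of the depth range; the scaling convention and helper names are assumptions for illustration:

```python
DEPTH_MAX = 0xffffffff       # normalized depth range, front (0) to back
DEPTH_THRESHOLD = 0x1000000  # predetermined depth difference from the claims

def normalize(window_z: float) -> int:
    """Map a window-space depth in [0.0, 1.0] to an integer in [0, DEPTH_MAX]."""
    return int(window_z * DEPTH_MAX)

def link_selected(link_z: float, non_link_z: float) -> bool:
    """True when the link object wins: the non-link object lies at least
    DEPTH_THRESHOLD behind the link in normalized depth units."""
    return normalize(non_link_z) - normalize(link_z) >= DEPTH_THRESHOLD

print(link_selected(0.500, 0.503))  # False: gap under 1/256 of the range,
                                    # so the non-link object is chosen
print(link_selected(0.100, 0.900))  # True: link is well in front, link chosen
```
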
  • FIG. 6 shows a computer 30 for selecting a target object 4 using process 20. Computer 30 includes a processor 33, a memory 39, a storage medium 41 (e.g., a hard disk), and a 3D graphics processor for processing data in the 3D space of FIGS. 1-4. Storage medium 41 stores the 3D data 44, which defines the 3D space, and computer instructions 42, which are executed by processor 33 out of memory 39 to select a target object using process 20. [0024]
  • Process 20 is not limited to use with the hardware and software of FIG. 6; it may find applicability in any computing or processing environment and with any type of machine that is capable of running a computer program. Process 20 may be implemented in hardware, software, or a combination of the two. Process 20 may be implemented in computer programs executed on programmable computers that each include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and one or more output devices. Program code may be applied to data entered using an input device to perform process 20 and to generate output information. [0025]
  • Each such program may be implemented in a high-level procedural or object-oriented programming language to communicate with a computer system. However, the programs can be implemented in assembly or machine language. The language may be a compiled or an interpreted language. Each computer program may be stored on a storage medium or device (e.g., CD-ROM, hard disk, or magnetic diskette) that is readable by a general or special purpose programmable computer, for configuring and operating the computer to perform process 20 when the storage medium or device is read. Process 20 may also be implemented as a computer-readable storage medium, configured with a computer program, where, upon execution, instructions in the computer program cause the computer to operate in accordance with process 20. [0026]
  • The invention is not limited to the specific embodiments described herein. For example, the invention can prioritize non-link and link objects differently, e.g., give general priority to link objects over non-link objects. The invention can be used with objects other than non-link objects and link objects. The invention is also not limited to use in 3D space, but rather can be used in N-dimensional space (N≧3). The invention is not limited to the specific processing order of FIG. 5. Rather, the specific blocks of FIG. 5 may be re-ordered, as necessary, to achieve the results set forth above. [0027]
  • Other embodiments not described herein are also within the scope of the following claims. [0028]

Claims (24)

What is claimed is:
1. A method of selecting a target object in virtual three-dimensional space, comprising:
identifying objects, including the target object, in the virtual three-dimensional space;
determining distances between the objects and a point in the virtual three-dimensional space;
prioritizing the objects based on distances and identities of the objects; and
selecting the target object from among the objects based on priority.
2. The method of claim 1, wherein the objects comprise one or more of a link object and non-link object.
3. The method of claim 2, wherein prioritizing comprises assigning a higher priority to the non-link objects than to the link objects if the distances meet a predetermined criterion.
4. The method of claim 1 wherein:
the objects include a link object; and
prioritizing comprises assigning higher priority to the link object if the link object is closer to the point than a non-link object by a predetermined distance.
5. The method of claim 4, wherein the predetermined distance comprises 0x1000000.
6. The method of claim 1, wherein identifying comprises distinguishing between a link object and a non-link object.
7. The method of claim 1, further comprising:
receiving coordinates based on a user input; and
locating the objects in the virtual three-dimensional space based on the coordinates.
8. The method of claim 1, wherein determining the distances comprises obtaining differences between coordinates in the virtual three-dimensional space for the objects and coordinates in the virtual three-dimensional space for the point.
9. An apparatus for selecting a target object in virtual three-dimensional space, comprising:
a memory that stores executable instructions; and
a processor that executes the instructions to:
identify objects, including the target object, in the virtual three-dimensional space;
determine distances between the objects and a point in the virtual three-dimensional space;
prioritize the objects based on distances and identities of the objects; and
select the target object from among the objects based on priority.
10. The apparatus of claim 9, wherein the objects comprise one or more of a link object and non-link object.
11. The apparatus of claim 9, wherein prioritizing comprises assigning a higher priority to the non-link objects than to the link objects if the distances meet a predetermined criterion.
12. The apparatus of claim 9, wherein:
the objects include a link object; and
prioritizing comprises assigning higher priority to the link object if the link object is closer to the point than a non-link object by a predetermined distance.
13. The apparatus of claim 12, wherein the predetermined distance comprises 0x1000000.
14. The apparatus of claim 9, wherein identifying comprises distinguishing between a link object and non-link object.
15. The apparatus of claim 9, wherein the processor executes instructions to:
receive coordinates based on a user input; and
locate the objects in the virtual three-dimensional space based on the coordinates.
16. The apparatus of claim 9, wherein determining the distances comprises obtaining differences between coordinates in the virtual three-dimensional space for the objects and coordinates in the virtual three-dimensional space for the point.
17. An article comprising a computer-readable medium that stores executable instructions for selecting a target object in virtual three-dimensional space, the instructions causing a machine to:
identify objects, including the target object, in the virtual three-dimensional space;
determine distances between the objects and a point in the virtual three-dimensional space;
prioritize the objects based on distances and identities of the objects; and
select the target object from among the objects based on priority.
18. The article of claim 17, wherein the objects comprise one or more of a link object and non-link object.
19. The article of claim 18, wherein prioritizing comprises assigning a higher priority to the non-link objects than to the link objects if the distances meet a predetermined criterion.
20. The article of claim 17, wherein:
the objects include a link object; and
prioritizing comprises assigning higher priority to the link object if the link object is closer to the point than a non-link object by a predetermined distance.
21. The article of claim 20, wherein the predetermined distance comprises 0x1000000.
22. The article of claim 17, wherein identifying comprises distinguishing between a link object and a non-link object.
23. The article of claim 17, wherein the article further comprises instructions to:
receive coordinates based on a user input; and
locate the objects in the virtual three-dimensional space based on the coordinates.
24. The article of claim 17 wherein determining the distances comprises obtaining differences between coordinates in the virtual three-dimensional space for the objects and coordinates in the virtual three-dimensional space for the point.
US09/863,046 2001-05-22 2001-05-22 Selecting a target object in three-dimensional space Abandoned US20020175911A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/863,046 US20020175911A1 (en) 2001-05-22 2001-05-22 Selecting a target object in three-dimensional space


Publications (1)

Publication Number Publication Date
US20020175911A1 true US20020175911A1 (en) 2002-11-28

Family

ID=25340106

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/863,046 Abandoned US20020175911A1 (en) 2001-05-22 2001-05-22 Selecting a target object in three-dimensional space

Country Status (1)

Country Link
US (1) US20020175911A1 (en)


Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4600919A (en) * 1982-08-03 1986-07-15 New York Institute Of Technology Three dimensional animation
US4600919B1 (en) * 1982-08-03 1992-09-15 New York Inst Techn
US5124914A (en) * 1987-05-21 1992-06-23 Commissariat A L'energie Atomique Method and device for obtaining tridimensional optical image formation from bidimensional measurements of attenuation of radiation through an object
US5163126A (en) * 1990-05-10 1992-11-10 International Business Machines Corporation Method for adaptively providing near phong grade shading for patterns in a graphics display system
US5608850A (en) * 1994-04-14 1997-03-04 Xerox Corporation Transporting a display object coupled to a viewpoint within or between navigable workspaces
US5689628A (en) * 1994-04-14 1997-11-18 Xerox Corporation Coupling a display object to a viewpoint in a navigable workspace
US5736982A (en) * 1994-08-03 1998-04-07 Nippon Telegraph And Telephone Corporation Virtual space apparatus with avatars and speech
US5731819A (en) * 1995-07-18 1998-03-24 Softimage Deformation of a graphic object to emphasize effects of motion
US6388670B2 (en) * 1996-04-25 2002-05-14 Matsushita Electric Industrial Co., Ltd. Transmitter-receiver of three-dimensional skeleton structure motions and method thereof
US5732232A (en) * 1996-09-17 1998-03-24 International Business Machines Corp. Method and apparatus for directing the expression of emotion for a graphical user interface
US6346956B2 (en) * 1996-09-30 2002-02-12 Sony Corporation Three-dimensional virtual reality space display processing apparatus, a three-dimensional virtual reality space display processing method, and an information providing medium
US5841440A (en) * 1996-12-17 1998-11-24 Apple Computer, Inc. System and method for using a pointing device to indicate movement through three-dimensional space
US6057859A (en) * 1997-03-31 2000-05-02 Katrix, Inc. Limb coordination system for interactive computer animation of articulated characters with blended motion data
US6337880B1 (en) * 1997-04-04 2002-01-08 Avid Technology, Inc. Indexing for motion video that is compressed using interframe and intraframe techniques
US6208347B1 (en) * 1997-06-23 2001-03-27 Real-Time Geometry Corporation System and method for computer modeling of 3D objects and 2D images by mesh constructions that incorporate non-spatial data such as color or texture
US6396509B1 (en) * 1998-02-21 2002-05-28 Koninklijke Philips Electronics N.V. Attention-based interaction in a virtual environment
US6628307B1 (en) * 1999-11-03 2003-09-30 Ronald J. Fair User interface for internet application

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7747715B2 (en) * 2001-10-15 2010-06-29 Jacobs Rimell Limited Object distribution
US20050108724A1 (en) * 2001-10-15 2005-05-19 Keith Sterling Object distribution
US8836639B2 (en) 2005-03-10 2014-09-16 Nintendo Co., Ltd. Storage medium storing game program and game apparatus
US20090187863A1 (en) * 2005-03-10 2009-07-23 Nintendo Co., Ltd. Storage medium storing input processing program and input processing apparatus
US20110227916A1 (en) * 2005-03-10 2011-09-22 Nintendo Co., Ltd. Storage medium storing game program and game apparatus
US8120574B2 (en) * 2005-03-10 2012-02-21 Nintendo Co., Ltd. Storage medium storing game program and game apparatus
US8139027B2 (en) 2005-03-10 2012-03-20 Nintendo Co., Ltd. Storage medium storing input processing program and input processing apparatus
US9849383B2 (en) * 2005-03-10 2017-12-26 Nintendo Co., Ltd. Storage medium storing game program and game apparatus
US20140349759A1 (en) * 2005-03-10 2014-11-27 Nintendo Co., Ltd. Storage medium storing game program and game apparatus
US20060205502A1 (en) * 2005-03-10 2006-09-14 Nintendo Co., Ltd. Storage medium storing game program and game apparatus
US20090309872A1 (en) * 2005-11-29 2009-12-17 Yasuhiro Kawabata Object Selecting Device, Object Selecting Method, Information Recording Medium, And Program
US20140282267A1 (en) * 2011-09-08 2014-09-18 Eads Deutschland Gmbh Interaction with a Three-Dimensional Virtual Scenario
US8732620B2 (en) 2012-05-23 2014-05-20 Cyberlink Corp. Method and system for a more realistic interaction experience using a stereoscopic cursor
CN102760308A (en) * 2012-05-25 2012-10-31 任伟峰 Method and device for node selection of object in three-dimensional virtual reality scene
US8957855B2 (en) 2012-06-25 2015-02-17 Cyberlink Corp. Method for displaying a stereoscopic cursor among stereoscopic objects
US20160041641A1 (en) * 2013-04-26 2016-02-11 Spreadtrum Communications (Shanghai) Co.,Ltd Method and apparatus for generating a three-dimensional user interface
US9684412B2 (en) * 2013-04-26 2017-06-20 Speadtrum Communications (Shanghai) Co., Ltd. Method and apparatus for generating a three-dimensional user interface
CN105264571A (en) * 2013-05-30 2016-01-20 查尔斯·安东尼·史密斯 Hud object design and method
KR20160013928A (en) * 2013-05-30 2016-02-05 찰스 안소니 스미스 Hud object design and method
US20160071320A1 (en) * 2013-05-30 2016-03-10 Charles Anthony Smith HUD Object Design and Method
US10217285B2 (en) * 2013-05-30 2019-02-26 Charles Anthony Smith HUD object design and method
KR102249577B1 (en) 2013-05-30 2021-05-07 찰스 안소니 스미스 Hud object design and method

Similar Documents

Publication Publication Date Title
US6356281B1 (en) Method and apparatus for displaying translucent overlapping graphical objects on a computer monitor
US6212577B1 (en) Method and apparatus for improved interaction with an application program according to data types and actions performed by the application program
KR101278793B1 (en) Object association in a computer generated drawing environment
US20080275850A1 (en) Image tag designating apparatus, image search apparatus, methods of controlling operation of same, and programs for controlling computers of same
US20020111699A1 (en) Dynamically configurable generic container
JP6593066B2 (en) Information processing apparatus, information processing method, and program
US9495719B2 (en) Multi-source, multi-destination data transfers
EP0551192A1 (en) Spatially organized computer display system
US20130215034A1 (en) Extraction method for multi-touch feature information and recognition method for multi-touch gestures using multi-touch feature information
JPH09146751A (en) Window display method and window management device
JP2000057359A (en) Method for changing display attribute of graphic object, method for selecting graphic object, graphic object display control device, storage medium storing program for changing display attribute of graphic object, and storing medium storing program for controlling selection of graphic object
US9986060B2 (en) Persistent caching of map imagery and data
JPH0661107B2 (en) Image recognition system and its operation method
AU742998B2 (en) Dynamic object linking interface
CA2430446A1 (en) Method of indexing entities
US20020175911A1 (en) Selecting a target object in three-dimensional space
US7913166B2 (en) Method and apparatus for implied editing action through directional and ordered data selection
CN104268012B (en) A kind of image data processing method and processing device
US11080472B2 (en) Input processing method and input processing device
JP2006277166A (en) Three-dimensional shape comparison program and three-dimensional similar shape retrieval program
JP2005085174A (en) Link destination content prereader, prereading method, and preread area calculation method
US7849470B2 (en) System and method for extending a programming language to include multiple dissimilar object systems
US20120284735A1 (en) Interaction-Based Interface to a Logical Client
US20090249205A1 (en) Display position determination apparatus and method thereof
US6625805B1 (en) Dynamic byte code examination to detect whether a GUI component handles mouse events

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTEL CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIGHT, JOHN J.;SMITH, MICHAEL D.;MILLER, JOHN D.;AND OTHERS;REEL/FRAME:011847/0203

Effective date: 20010517

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION