CN1845051A - Input position processing device - Google Patents

Input position processing device

Info

Publication number
CN1845051A
CN1845051A (application CNA2005101329549A)
Authority
CN
China
Prior art keywords
aforementioned
coordinate
input
input coordinate
display part
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2005101329549A
Other languages
Chinese (zh)
Other versions
CN100538613C (en)
Inventor
桥本英之
北山茂寿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nintendo Co Ltd
Original Assignee
Nintendo Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nintendo Co Ltd filed Critical Nintendo Co Ltd
Publication of CN1845051A publication Critical patent/CN1845051A/en
Application granted granted Critical
Publication of CN100538613C publication Critical patent/CN100538613C/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An input position processing program detects a series of position data items based on input positions output from a pointing device, and defines a reference position. When the last position data item indicates a point within a defined area other than the reference area, the operation to be performed is determined based on the direction extending from the reference position to the area indicated by that last item. When an intermediate position data item indicates a point within a defined area other than the reference area while the last item indicates a point back within the reference area, the operation to be performed is determined based on the combination of directions representing a reciprocal movement between the reference position and the area indicated by the intermediate item.

Description

Input coordinate processing method
Technical field
The present invention relates to an input coordinate processing method, an input coordinate processing apparatus, an input coordinate processing program, and a recording medium storing such a program. More particularly, it relates to determining processing to be performed from input coordinates output by a pointing device, such as a touch panel, that outputs coordinate information in a prescribed coordinate system.
Background technology
Conventionally, techniques are known that determine, based on coordinates input by operating a mouse, a predetermined process to be performed next (hereinafter, "next process"). Techniques are also known that perform display processing on a display unit based on coordinates input through a touch panel or a mouse. For example, "Mouse Gestures in Opera", [online], [retrieved March 15, 2005], Internet <URL: http://www.opera.com/features/mouse/> (hereinafter, non-patent literature 1) proposes a web browser in which web pages on the Internet can be browsed through operations known as "mouse gestures".
In the browser described in non-patent literature 1, if, within the background area of a web page, the mouse is moved to the left while the right button is held down and the button is then released, the browser advances to the next page. If the mouse is instead moved to the right while the right button is held down and the button is then released, the browser returns to the previous page. Likewise, moving the mouse downward while holding the right button and then releasing the button opens a new window, and moving the mouse downward and then upward before releasing the right button duplicates the window.
Furthermore, when the pointer is on a hyperlink of a web page, right-clicking, moving the mouse upward while holding the right button, and then releasing the button opens the link target in a new window. Right-clicking on a link, moving the mouse downward and then upward while holding the right button, and then releasing it opens the link target in the background. These differ from the case in which the link is simply left-clicked, where the link target opens in the parent window.
Thus, in the mouse gestures disclosed in non-patent literature 1, the next process is determined from the direction, or the combination of directions, accumulated and stored from the trajectory of the input coordinates of the mouse. This spares the user the trouble of deliberately clicking buttons arranged around the web page in the display screen or operating a toolbar.
In browsers such as the one called Sleipnir, the recognized gesture is displayed at the bottom left of the screen. For example, the directions accumulated from the trajectory of the input coordinates are displayed in sequence as "↑", "↓", "↑", "→". By watching this display, the user can anticipate the next process.
Also in the mouse gestures disclosed in non-patent literature 1, display processing on the display unit is performed according to the direction, or the combination of directions, accumulated from the trajectory of the input coordinates of the mouse.
Furthermore, Japanese Patent No. 3095145 (hereinafter, patent document 1) discloses a technique in which the trajectory of input coordinates on a touch panel is treated as a stroke: when the stroke does not turn back on itself, a window is moved to the stroke's destination, and when the stroke does turn back, the window is stored in a designated location.
However, in the browser described in non-patent literature 1, no problem arises as long as the user draws an appropriate trajectory in a single attempt, but a problem does arise when an inappropriate trajectory is drawn. Because the result is an accumulation of the trajectory of the mouse's input coordinates, continuing to move the mouse in an attempt to correct an inappropriate trajectory only accumulates more unintended trajectory, and in the end the user cannot obtain the desired next process. To delete the inappropriate trajectory, the user must perform the cumbersome operation of releasing the right mouse button, right-clicking once more, and drawing the desired trajectory again.
On the other hand, one could refrain from combining multiple directions, accumulate the trajectory of the mouse's input coordinates, and determine the next process from the single direction obtained from it. However, the number of selectable next processes is then limited to the number of cases distinguishable by a single direction, leaving a shortage of operation gestures for designating next processes. The number of distinguishable cases could be increased by subdividing the directions, but the angle assigned to each direction then becomes small, so the direction the user draws often differs from the direction the computer recognizes, making operation difficult for the user.
In addition, as mentioned above, there are browsers that display the recognized operation gesture at the bottom of the left side of the display screen. However, because the gesture is displayed at this inconspicuous position, it is hard for the user to notice. Even when the user can notice it, seeing the gesture clearly requires repeatedly switching the line of sight from the reading position on the web page to the bottom left of the screen, which is troublesome.
Moreover, in the browser of non-patent literature 1, operation by mouse gestures often does not feel intuitive to the user. For example, it is hard to perceive any intuitive connection between the operation "move the mouse downward in the background area of a web page while holding the right button" and the display process "open a new window". Similarly, there is no intuitive link between "move the mouse downward and then upward in the background area while holding the right button" and "duplicate the window". The user must therefore deliberately memorize the relations between these operations and display processes as a set of special rules, and this burden hinders effective use of the mouse gesture function.
Likewise, in the technique of patent document 1, the relation between the touch operation of turning back the stroke drawn on the touch panel and the display process of storing the window is merely another special rule that the user must deliberately learn; no intuitive association arises between the operation and the display process. Moreover, even if the association between a user's operation and the corresponding display process is strong, it is meaningless if the display process itself is not useful. What is desired, therefore, is a display process that is both strongly associated with the user's operation and useful in itself.
Summary of the invention
Accordingly, a first object of the present invention is to provide an input coordinate processing method, an input coordinate processing apparatus, an input coordinate processing program, and a recording medium storing the program, that preserve the number of processes selectable by a coordinate input operation while simplifying the handling of inappropriate trajectories, thereby improving coordinate input operability. A second object is to provide an input coordinate processing method, apparatus, program, and recording medium with which the user can easily recognize the operation gesture that has been recognized. A third object is to provide an input coordinate processing method, apparatus, program, and recording medium that perform highly useful display processing through intuitive coordinate input operations.
To attain the above objects, the present invention adopts the following configurations. The reference marks and step numbers in parentheses (a step is abbreviated S, and only the step number is given) indicate correspondence with the embodiments described later in order to aid understanding of the present invention, and do not limit its scope.
A first aspect is an input coordinate processing method that operates on input coordinates (coordinate data corresponding to a contact position) output from a pointing device 15 in response to user operation. The method comprises: an input coordinate detection step (S43, S63, S68, S73, S84, S87, S92, S104, S107, S114, S120, S134, S139, S144); a reference coordinate storage step S44; a multi-area setting step (Figure 11); and a process determination step S50 to S52. The input coordinate detection step detects coordinate information DC1 in a prescribed coordinate system (the screen coordinate system) from the input coordinates output by the pointing device. The reference coordinate storage step sets and stores a reference coordinate DC2 based on the first item of coordinate information in the series detected in the input coordinate detection step. The multi-area setting step forms a plurality of areas AM, AT, AB, AL, AR with the reference coordinate as a base point, and sets the area containing the reference coordinate as the reference area AM. The process determination step determines a process based on the reference coordinate and the last item of coordinate information in the detected series (the coordinate at the moment the touch is released). The process determination step includes a first process determination step S115, S135, S140, S145 and a second process determination step S121. The first process determination step, when the last coordinate information indicates a point within one of the areas AT, AB, AL, AR outside the reference area, determines a process based on the direction from the reference coordinate to the area indicated by that last coordinate information. The second process determination step, when an intermediate item of coordinate information in the detected series has indicated a point in an area outside the reference area (DF8 set) and the last coordinate information indicates a point back within the reference area, determines a process based on the combination of the direction from the reference coordinate to the area indicated by the intermediate coordinate information and the direction from that area back to the reference coordinate. The pointing device is an input device that designates an input position or coordinate on the screen, realized by, for example, a touch panel, mouse, track pad, or track ball. The coordinate systems used by these input devices are the touch panel coordinate system and the screen coordinate system. The input coordinate processing method may also be realized as an input coordinate processing program executed by a computer 21 of a device that operates on input coordinates output from a pointing device in response to user operation, and as a recording medium storing that program.
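The first aspect's region layout and direction-based decision can be sketched in code. The following is an illustrative sketch only, not the patent's implementation: the region labels AM, AT, AB, AL, AR follow the specification, but the 20-pixel half-size of the reference area and the diagonal split between horizontal and vertical regions are assumptions introduced here.

```python
# Hypothetical sketch of the first aspect: classify the last input
# coordinate into one of five regions laid out around the reference
# coordinate, then decide the process from the direction alone.

HALF = 20  # assumed half-width/height of the reference area AM, in pixels

def region_of(point, reference):
    """Return the label of the region containing `point`, relative to `reference`."""
    dx = point[0] - reference[0]
    dy = point[1] - reference[1]
    if abs(dx) <= HALF and abs(dy) <= HALF:
        return "AM"                         # reference area around the first touch
    if abs(dy) > abs(dx):                   # vertical movement dominates
        return "AT" if dy < 0 else "AB"     # screen y grows downward
    return "AL" if dx < 0 else "AR"

def decide_operation(coords):
    """First process determination: direction from reference to last coordinate."""
    reference = coords[0]                   # reference coordinate storage step
    last = region_of(coords[-1], reference)
    if last == "AM":
        return None                         # left to the second determination step
    return {"AT": "up", "AB": "down", "AL": "left", "AR": "right"}[last]
```

Because only the first and last coordinates matter here, wobble in the middle of the stroke does not accumulate, which is the point the invention makes against trajectory-accumulating mouse gestures.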
In a second aspect based on the first aspect, the process determination step further comprises: a flag setting step S112, S132, S137, S142; a first flag updating step S112, S132, S137, S142; and a second flag updating step S118. The flag setting step, when the position indicated by the latest coordinate information moves from the reference area into another area, sets and stores a flag DF8, DF10 to DF12 corresponding to that other area. The first flag updating step, when, after a flag has been set in the flag setting step, the position indicated by the latest coordinate information moves from the flagged area into yet another area different from the reference area, updates the currently set flag to the flag DF8, DF10 to DF12 corresponding to the newly indicated area, and stores it. The second flag updating step, when, after a flag has been set in the flag setting step or the first flag updating step, the position indicated by the latest coordinate information moves from the flagged area back into the reference area, updates the currently set flag to a reciprocation flag DF9 indicating a round trip between the reference area and the flagged area, and stores it. The first process determination step determines a process based on the flag set by the flag setting step or the first flag updating step, and the second process determination step determines a process based on the flag set by the second flag updating step.
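The flag setting and updating steps of the second aspect amount to a small state machine over region labels. Below is a minimal sketch under the assumption that the input has already been reduced to a sequence of region labels; the flag strings are illustrative stand-ins, not the DF8 to DF12 flags of the specification, and corner cases the specification handles elsewhere are omitted.

```python
# Hypothetical state machine for the second aspect's flags: leaving the
# reference area sets a flag for the entered region, moving on to another
# outer region replaces that flag, and returning to the reference area
# converts it into a reciprocation flag.

def track_flags(regions):
    """Consume a sequence of region labels and return the final flag."""
    flag = None
    for region in regions:
        if region == "AM":
            if flag and not flag.startswith("reciprocal:"):
                # second flag updating step: back in the reference area
                flag = "reciprocal:" + flag
        elif flag is None or flag.split(":")[-1] != region:
            # flag setting step / first flag updating step
            flag = region
    return flag
```

Note that only one flag is ever stored: an inappropriate excursion is simply overwritten by the next region entered, rather than accumulating as in trajectory-based mouse gestures.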
A third aspect based on the first aspect further comprises a display control step S113, S119, S133, S138, S143. The display control step displays, on a display device 12, indices M8 to M12 each representing a direction: the direction (up, down, left, or right) from the reference coordinate to the area indicated by the last coordinate information for which the first process determination step has determined a process, or the combination of directions for which the second process determination step has determined a process.
A fourth aspect based on the first aspect further comprises a display control step. The display control step displays, on the display device, an image M7 representing the reference area set in the multi-area setting step.
A fifth aspect based on the first aspect further comprises a display control step that displays an image based on the coordinate system on the display device. The multi-area setting step divides the image region surrounding the reference area into a plurality of segments, thereby forming the plurality of areas.
A sixth aspect is an input coordinate processing method that operates on input coordinates output from a pointing device in response to user operation. The method comprises: an input coordinate detection step; a reference coordinate storage step; a first process determination step; a second process determination step; and a display control step. The input coordinate detection step detects coordinate information in a prescribed coordinate system from the input coordinates output by the pointing device. The reference coordinate storage step sets and stores a reference coordinate based on the first item of coordinate information in the detected series. The first process determination step determines a process based on the direction from the position indicated by the reference coordinate to the position indicated by the last item of coordinate information in the detected series. The second process determination step determines a process based on the combination of the direction from the position indicated by the reference coordinate to a position indicated by an intermediate item of coordinate information in the detected series, and the direction from that intermediate position back to the position indicated by the reference coordinate. The display control step displays an image based on the coordinate system on the display device and, near the position indicated by the last coordinate information in that image, displays indices each representing the direction for which the first process determination step has determined a process or the combination of directions for which the second process determination step has determined a process. Here, "near" the position indicated by the last coordinate information includes at least that position itself and the areas adjacent to it. This input coordinate processing method may likewise be realized as an input coordinate processing program executed by the computer 21 and as a recording medium storing that program.
A seventh aspect is an input coordinate processing apparatus that operates on input coordinates output from a pointing device in response to user operation. The apparatus has: a storage unit 24; an input coordinate detection unit; a reference coordinate storage processing unit; a multi-area setting unit; and a process determination unit. The input coordinate detection unit detects coordinate information in a prescribed coordinate system from the input coordinates output by the pointing device. The reference coordinate storage processing unit sets a reference coordinate based on the first item of coordinate information in the series detected by the input coordinate detection unit, and stores it in the storage unit. The multi-area setting unit forms a plurality of areas with the reference coordinate as a base point and sets the area containing the reference coordinate as the reference area. The process determination unit determines a process based on the reference coordinate and the last item of coordinate information in the detected series, and includes a first process determination unit and a second process determination unit. The first process determination unit, when the last coordinate information indicates a point in an area outside the reference area, determines a process based on the direction from the reference coordinate to the area indicated by that last coordinate information. The second process determination unit, when an intermediate item of coordinate information in the detected series has indicated a point in an area outside the reference area and the last coordinate information indicates a point back within the reference area, determines a process based on the combination of the direction from the reference coordinate to the area indicated by the intermediate coordinate information and the direction from that area back to the reference coordinate.
An eighth aspect is an input coordinate processing apparatus that operates on input coordinates output from a pointing device in response to user operation. The apparatus has: a storage unit; an input coordinate detection unit; a reference coordinate storage processing unit; a first process determination unit; a second process determination unit; and a display control unit. The input coordinate detection unit detects coordinate information in a prescribed coordinate system from the input coordinates output by the pointing device. The reference coordinate storage processing unit sets a reference coordinate based on the first item of coordinate information in the detected series and stores it in the storage unit. The first process determination unit determines a process based on the direction from the position indicated by the reference coordinate to the position indicated by the last item of coordinate information in the detected series. The second process determination unit determines a process based on the combination of the direction from the position indicated by the reference coordinate to a position indicated by an intermediate item of coordinate information in the detected series, and the direction from that intermediate position back to the position indicated by the reference coordinate. The display control unit displays an image based on the coordinate system on the display device and, near the position indicated by the last coordinate information in that image, displays indices each representing the direction for which the first process determination unit has determined a process or the combination of directions for which the second process determination unit has determined a process.
A ninth aspect is an input coordinate processing method that, based on input coordinates (coordinate data corresponding to a contact position) output from a pointing device 15 in response to user operation, displays screens on each of a first display part 12 or 11 and a second display part 11 or 12 constituting a display device. The method comprises: an input coordinate storage step S43, S63, S68, S73, S84, S87, S92, S104, S107, S114, S120, S134, S139, S144; a reference coordinate storage step S44; a processing step S69, S88, S115, S135; and a display control step S69, S88, S115, S135. The input coordinate storage step detects and stores coordinate information DC1 in the display coordinate system (screen coordinate system) of the image displayed by the display device, from the input coordinates output by the pointing device. The reference coordinate storage step sets the first item of coordinate information in the stored series as the reference coordinate DC2 and stores it. The processing step obtains a display image (a response image corresponding to a bookmark or link, or an image displayed on the first LCD 11 or second LCD 12) based on at least one item of coordinate information in the series. The display control step, when the operating direction indicated by the series of coordinate information in the display coordinate system, taken relative to the reference coordinate, represents the reference direction from the position of the first display part toward the position of the second display part (up or down) (Yes in S65, S83, S111, or S131), displays the display image obtained in the processing step on the second display part (Fig. 5, Fig. 7, Figure 10). The display image obtained in the processing step includes at least a response image corresponding to a bookmark or link, or at least a partial image of what is displayed on the first display part. The pointing device is an input device that designates an input position or coordinate on the screen, realized by, for example, a touch panel, mouse, track pad, or track ball; the coordinate systems used are the touch panel coordinate system and the screen coordinate system. The first and second display parts may be physically separate display parts, or may be formed by dividing a single physical display screen. The first and second display parts may be arranged one above the other, or side by side. This input coordinate processing method may also be realized as an input coordinate processing program executed by a computer 21 that performs display processing on each of the first and second display parts of the display device based on input coordinates output from the pointing device in response to user operation, and as a recording medium storing that program.
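The ninth aspect's dispatch, a stroke whose overall direction runs from the first display part toward the second one moves the obtained image onto the second display, can be sketched as follows. This is an illustrative sketch, not the patent's implementation: it assumes two vertically stacked screens with the first display below the second, so the reference direction is "up", and that screen y grows downward.

```python
# Hypothetical sketch of the ninth aspect: show the obtained display
# image on the second display part when the stroke's overall direction
# matches the reference direction (first display -> second display).

def operating_direction(coords):
    """Overall stroke direction, measured from the reference (first) coordinate."""
    dx = coords[-1][0] - coords[0][0]
    dy = coords[-1][1] - coords[0][1]
    if abs(dy) >= abs(dx):
        return "up" if dy < 0 else "down"   # screen y grows downward
    return "left" if dx < 0 else "right"

def dispatch(coords, image, screens, reference_direction="up"):
    """Place `image` on the second display only when the stroke follows
    the reference direction; otherwise leave the screens unchanged."""
    if operating_direction(coords) == reference_direction:
        screens["second"] = image
    return screens
```

The design choice matches the stated third object: dragging a link upward, toward the other screen, and having its page appear on that screen is an operation whose meaning the user can guess without memorizing a gesture table.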
In a tenth aspect based on the ninth aspect, the input coordinate processing method performs, based on the input coordinates, display processing on the display device for a web browser that downloads and reads files via a communication unit 33 communicating with a network. In the processing step, when the reference coordinate lies in a response region (a bookmark or link) of the web browser (Yes in S46 or S47), the response image corresponding to that response region is obtained. In the display control step, when the operating direction represents the reference direction, the response image is displayed on the second display part (Fig. 5, Fig. 7).
In an eleventh aspect based on the tenth aspect, in the processing step, when the reference coordinate lies in the background area of the web browser (Yes in S48), at least a partial image of what is displayed on the first display part (an image displayed on the first LCD 11 or second LCD 12) is obtained. In the display control step, when the operating direction represents the reference direction, at least that partial image of the first display part's content is displayed on the second display part (Figure 10).
In a twelfth aspect based on the tenth aspect, the reference direction is the upward or downward direction along which the second display part is arranged relative to the first display part of the display device. In the processing step, when the reference coordinate lies in the background area of the web browser and the operating direction represents left or right, an image obtained previously in the processing step is obtained. In the display control step, when the operating direction represents left (Yes in S136) or right (Yes in S141), the previously obtained image is displayed on the first display part (S140, S145).
In a thirteenth aspect based on the ninth aspect, in the processing step, when the trajectory DC3 of the series of coordinate information surrounds a prescribed region displayed on the first display part (Yes in S103), an enlarged image in which the image of that region has been magnified is obtained (S108). In the display control step, the enlarged image is displayed on the second display part (Fig. 9).
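The thirteenth aspect needs a test for "the trajectory surrounds a region". The specification does not define that test at this point, so the sketch below approximates it under an assumption introduced here: a stroke that ends near its starting point is treated as closed, and its bounding box is taken as the region to magnify. The 15-pixel closing tolerance is likewise an assumed value.

```python
# Rough sketch for the thirteenth aspect: recover the region enclosed by
# a roughly closed stroke, as the bounding box of its coordinates.

CLOSE_TOLERANCE = 15  # assumed max distance for "returned to the start", in pixels

def enclosed_box(track):
    """Return (min_x, min_y, max_x, max_y) of a roughly closed track, else None."""
    (x0, y0), (xn, yn) = track[0], track[-1]
    if abs(xn - x0) > CLOSE_TOLERANCE or abs(yn - y0) > CLOSE_TOLERANCE:
        return None                       # stroke never closed on itself
    xs = [p[0] for p in track]
    ys = [p[1] for p in track]
    return (min(xs), min(ys), max(xs), max(ys))
```

The returned box would then be cropped from the first display part's image and magnified onto the second display part.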
In a fourteenth aspect based on the ninth aspect, the reference direction is the upward or downward direction along which the second display part is arranged relative to the first display part of the display device. In the processing step, when the operating direction represents left or right, a page image other than the page displayed on the first display part is obtained. In the display control step, when the operating direction represents left or right, that other page image is displayed on the first display part.
In a fifteenth aspect based on the ninth aspect, coordinate information in the display coordinate system corresponding to input coordinates output from the pointing device can be set for only one of the image displayed on the first display part and the image displayed on the second display part.
A sixteenth aspect is an input coordinate processing method for displaying screens on a first display unit and a second display unit, which constitute a display device, based on input coordinates output from a pointing device in accordance with a user's operation. The input coordinate processing method comprises: an input coordinate storing step; a reference coordinate storing step; a processing step (S74, S93, S121); and a display control step (S74, S93, S121). The input coordinate storing step detects and stores, in accordance with the input coordinates output from the pointing device, coordinate information in a display coordinate system of the image displayed by the display device. The reference coordinate storing step sets the first piece of coordinate information in a series of pieces of coordinate information stored in the input coordinate storing step as a reference coordinate, and stores it. The processing step obtains a display image based on at least one piece of coordinate information in the series of pieces of coordinate information. The display control step, when the operation direction in the display coordinate system indicated by the series of pieces of coordinate information with respect to the reference coordinate represents a reciprocating direction along the direction in which the first display unit and the second display unit are arranged (the up-down direction) (Yes in S70; Yes in S89; Yes in S116 and S117), displays at least a partial image displayed on the second display unit on the first display unit, and simultaneously displays the display image obtained in the processing step on the second display unit (Fig. 6, Fig. 8, Fig. 12). The display image obtained in the processing step includes a response image, at least a partial image displayed on the first display unit, and the like. The input coordinate processing method may also be realized in the form of an input coordinate processing program executed by a computer 21, which performs display processing for displaying screens on the first display unit and the second display unit constituting the display device based on the input coordinates output from the pointing device in accordance with the user's operation, and in the form of a recording medium on which that input coordinate processing program is recorded.
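As a minimal illustration, the four steps named in this aspect can be sketched as a simple data flow. Every name in the sketch is assumed for illustration only; the flowchart steps S74, S93 and S121 and the actual routing of images to the two display units are not modeled.

```python
# Hypothetical sketch of the sixteenth aspect's step structure.
# Names are illustrative and not prescribed by the specification.

def process_input_coordinates(samples):
    """samples: list of (x, y) display-coordinate-system points output
    from the pointing device, in input order."""
    stored = []                    # input coordinate storing step
    reference = None
    for xy in samples:
        stored.append(xy)
        if reference is None:      # reference coordinate storing step:
            reference = xy         # the first coordinate becomes the reference
    # processing step (stand-in: a display image is obtained from at
    # least one coordinate; here we just use the last one)
    display_image = stored[-1]
    # the display control step would compare the operation direction
    # against the reference coordinate and route images to the two LCDs
    return reference, display_image

ref, img = process_input_coordinates([(10, 20), (10, 40), (10, 15)])
assert ref == (10, 20)
assert img == (10, 15)
```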
A seventeenth aspect is an input coordinate processing method for displaying screens on a first display unit and a second display unit, which constitute a display device, based on input coordinates output from a pointing device in accordance with a user's operation. The input coordinate processing method comprises: an input coordinate storing step; a reference coordinate storing step; a processing step; and a display control step. The input coordinate storing step detects and stores, in accordance with the input coordinates output from the pointing device, coordinate information in a display coordinate system of the image displayed by the display device. The reference coordinate storing step sets the first piece of coordinate information in a series of pieces of coordinate information stored in the input coordinate storing step as a reference coordinate, and stores it. The processing step obtains a display image based on at least one piece of coordinate information in the series of pieces of coordinate information. The display control step, when the operation direction in the display coordinate system indicated by the series of pieces of coordinate information with respect to the reference coordinate represents the reference direction from the position of the first display unit toward the position of the second display unit, displays the display image obtained in the processing step on the second display unit; and, when the operation direction represents a reciprocating direction along the direction in which the first display unit and the second display unit are arranged, displays at least a partial image displayed on the second display unit on the first display unit, and simultaneously displays the display image obtained in the processing step on the second display unit.
An eighteenth aspect is an input coordinate processing device that performs display processing for displaying screens on a first display unit and a second display unit, which constitute a display device, based on input coordinates output from a pointing device in accordance with a user's operation. The input coordinate processing device comprises: a storage unit 24; an input coordinate storage processing unit; a reference coordinate storage processing unit; a processing unit; and a display control unit. The input coordinate storage processing unit detects, in accordance with the input coordinates output from the pointing device, coordinate information in a display coordinate system of the image displayed by the display device, and stores it in the storage unit. The reference coordinate storage processing unit sets the first piece of coordinate information in a series of pieces of coordinate information stored by the input coordinate storage processing unit as a reference coordinate, and stores it in the storage unit. The processing unit obtains a display image based on at least one piece of coordinate information in the series of pieces of coordinate information. The display control unit, when the operation direction in the display coordinate system indicated by the series of pieces of coordinate information with respect to the reference coordinate represents the reference direction from the position of the first display unit toward the position of the second display unit, displays the display image obtained by the processing unit on the second display unit.
A nineteenth aspect is an input coordinate processing device that performs display processing for displaying screens on a first display unit and a second display unit, which constitute a display device, based on input coordinates output from a pointing device in accordance with a user's operation. The input coordinate processing device comprises: a storage unit; an input coordinate storage processing unit; a reference coordinate storage processing unit; a processing unit; and a display control unit. The input coordinate storage processing unit detects, in accordance with the input coordinates output from the pointing device, coordinate information in a display coordinate system of the image displayed by the display device, and stores it in the storage unit. The reference coordinate storage processing unit sets the first piece of coordinate information in a series of pieces of coordinate information stored by the input coordinate storage processing unit as a reference coordinate, and stores it in the storage unit. The processing unit obtains a display image based on at least one piece of coordinate information in the series of pieces of coordinate information. The display control unit, when the operation direction in the display coordinate system indicated by the series of pieces of coordinate information with respect to the reference coordinate represents a reciprocating direction along the direction in which the first display unit and the second display unit are arranged, displays at least a partial image displayed on the second display unit on the first display unit, and simultaneously displays the display image obtained by the processing unit on the second display unit.
A twentieth aspect is an input coordinate processing device that performs display processing for displaying screens on a first display unit and a second display unit, which constitute a display device, based on input coordinates output from a pointing device in accordance with a user's operation. The input coordinate processing device comprises: a storage unit; an input coordinate storage processing unit; a reference coordinate storage processing unit; a processing unit; and a display control unit. The input coordinate storage processing unit detects, in accordance with the input coordinates output from the pointing device, coordinate information in a display coordinate system of the image displayed by the display device, and stores it in the storage unit. The reference coordinate storage processing unit sets the first piece of coordinate information in a series of pieces of coordinate information stored by the input coordinate storage processing unit as a reference coordinate, and stores it in the storage unit. The processing unit obtains a display image based on at least one piece of coordinate information in the series of pieces of coordinate information. The display control unit, when the operation direction in the display coordinate system indicated by the series of pieces of coordinate information with respect to the reference coordinate represents the reference direction from the position of the first display unit toward the position of the second display unit, displays the display image obtained by the processing unit on the second display unit; and, when the operation direction represents a reciprocating direction along the direction in which the first display unit and the second display unit are arranged, displays at least a partial image displayed on the second display unit on the first display unit, and simultaneously displays the display image obtained by the processing unit on the second display unit.
According to the first aspect, in the second process determining step, the process to be performed is determined by recognizing the user's gesture from the combination of a plurality of operation directions corresponding to input from the pointing device. Therefore, compared with distinguishing the user's operation gesture by a single operation direction, a greater variety of determinable processes can be produced. On the other hand, in the first process determining step, only the operation direction from the reference coordinate toward a region is recognized as a valid operation gesture, so the user can easily cancel a gesture simply by operating into another region outside the reference area. Accordingly, when the operation performed by the user is inappropriate, the user can easily cancel that operation and, within the same continuous operation, give a new operation instruction. That is, even though the variety of determinable processes is increased by the coordinate input operation, the problems that arise when an inappropriate trajectory has been drawn are reduced, and the operability of the coordinate input can be improved.
According to the second aspect, since a plurality of regions are managed by the entry and exit of the indicated position, the user's gesture can be distinguished more simply than with processing based on all input coordinates.
According to the third aspect, since the operation gesture recognized from the coordinate input operation performed by the user is displayed as an indicator, the user can reliably recognize the input operation gesture and the process determined by it.
According to the fourth aspect, by displaying the reference area that serves as the basis for distinguishing operation gestures, a target image can be formed that helps the user input valid operations with the pointing device.
According to the fifth aspect, a plurality of regions for distinguishing operation directions are formed by dividing the peripheral area centered on the reference area, so that operation directions centered on the reference area can be easily distinguished.
According to the sixth aspect, the operation gesture recognized from the user's coordinate input operation is displayed as an indicator near the position on the displayed image where the user is currently performing the input operation, so the user can reliably recognize the input operation gesture and the process determined by it.
According to the ninth aspect, highly useful display processing can be performed through intuitive operations in accordance with the user's operation of the pointing device. For example, if the user uses the pointing device to operate in the direction from the first display unit toward the second display unit, the image obtained in the processing step (a response image corresponding to a tab or link, at least a partial image displayed on the first display unit, or the like) is displayed on the second display unit located in the operation direction (the direction from the first display unit toward the second display unit). In this way, the operation direction input with the pointing device and the display processing corresponding to it are intuitive to the user. Moreover, the original display information is retained on the first display unit while information corresponding to the processing is displayed on the second display unit, which is useful in a variety of situations.
According to the tenth aspect, in the display processing of a web browser, a response image obtained by selecting a tab or a hyperlink target can be displayed on the second display unit located in the operation direction.
According to the eleventh aspect, in the display processing of a web browser, the image displayed on the first display unit can be displayed on the second display unit located in the operation direction.
According to the twelfth aspect, in the display processing of a web browser, by operating the pointing device leftward or rightward, differently from the reference direction, display processing equivalent to the so-called "back" and "forward" operations, which are conventionally represented by left and right, can be performed.
According to the thirteenth aspect, by an operation in which the user surrounds a partial image displayed on the first display unit, the desired image can be displayed enlarged on the second display unit.
According to the fourteenth aspect, by operating the pointing device leftward or rightward, differently from the reference direction, display processing equivalent to the "previous page" and "next page" operations of electronic books and the like, which are conventionally represented by left and right, can be performed.
According to the fifteenth aspect, by operating the one display unit that accepts operation input, display processing can be performed on the other display unit that does not accept operation input.
According to the sixteenth aspect, highly useful display processing can be performed through intuitive operations in accordance with the user's operation of the pointing device. For example, if the user uses the pointing device to perform a reciprocating operation along the direction in which the first display unit and the second display unit are arranged, at least the partial image displayed on the second display unit is displayed on the first display unit located in one of the operation directions (the direction from the second display unit toward the first display unit), while the image obtained in the processing step (a response image corresponding to a tab or link, at least a partial image displayed on the first display unit, or the like) is displayed on the second display unit located in the other operation direction (the direction from the first display unit toward the second display unit). Therefore, the information displayed on the display units along the user's reciprocating operation direction is displayed alternately on the first display unit and the second display unit. In this way, the operation direction input with the pointing device and the display processing corresponding to it are intuitive to the user. Moreover, the original display information is retained on the first display unit while information corresponding to the processing is displayed on the second display unit, which is useful in a variety of situations.
According to the seventeenth aspect, the same effects as those of the ninth and sixteenth aspects can be obtained.
According to the input coordinate processing device of the present invention, the same effects as those of the input coordinate processing method described above can be obtained. Furthermore, when the input coordinate processing method is realized in the form of an input coordinate processing program, or of a recording medium on which the input coordinate processing program is recorded, the same effects as those of the input coordinate processing method can likewise be obtained.
These and other objects, features, aspects and effects of the present invention will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings.
Description of drawings
Fig. 1 is an external view of a game machine 1 that executes the input coordinate processing program of the present invention;
Fig. 2 is a block diagram showing the internal configuration of the game machine 1 of Fig. 1;
Fig. 3 is a diagram showing an example of the initial-stage screen display on the first LCD 11 and the second LCD 12;
Fig. 4 is a diagram showing an example of the screen display on the first LCD 11 and the second LCD 12 when a tap operation has been performed on the touch panel 15 over a hyperlink target displayed on the second LCD 12;
Fig. 5 is a diagram showing an example of the screen display on the first LCD 11 and the second LCD 12 when an upward slide operation has been performed on the touch panel 15 from a hyperlink target displayed on the second LCD 12;
Fig. 6 is a diagram showing an example of the screen display on the first LCD 11 and the second LCD 12 when an upward slide operation followed by a reciprocating downward slide operation has been performed on the touch panel 15 from a hyperlink target displayed on the second LCD 12;
Fig. 7 is a diagram showing an example of the screen display on the first LCD 11 and the second LCD 12 when a downward slide operation has been performed on the touch panel 15 from a tab displayed on the second LCD 12;
Fig. 8 is a diagram showing an example of the screen display on the first LCD 11 and the second LCD 12 when a downward slide operation followed by a reciprocating upward slide operation has been performed on the touch panel 15 from a tab displayed on the second LCD 12;
Fig. 9 is a diagram showing an example of the screen display on the first LCD 11 and the second LCD 12 when a slide operation has been performed on the touch panel 15 so as to surround partial information displayed on the second LCD 12;
Fig. 10 is a diagram showing an example of the screen display on the first LCD 11 and the second LCD 12 when an upward slide operation has been performed on the touch panel 15 from the background image displayed on the second LCD 12;
Fig. 11 is a diagram for explaining the regions that are set when a slide operation is performed on the touch panel 15 over the background image displayed on the second LCD 12;
Fig. 12 is a diagram showing an example of the screen display on the first LCD 11 and the second LCD 12 when an upward slide operation followed by a reciprocating downward slide operation has been performed on the touch panel 15 from the background image displayed on the second LCD 12;
Fig. 13 is a flowchart showing the input coordinate processing operation that the game machine 1 performs by executing the input coordinate processing program;
Fig. 14 is a subroutine showing the detailed operation of the link designation process of step 50 in Fig. 13;
Fig. 15 is a subroutine showing the detailed operation of the tab designation process of step 51 in Fig. 13;
Fig. 16 is a subroutine showing the detailed operation of the background designation process of step 52 in Fig. 13;
Fig. 17 is a subroutine showing the detailed operation of the background designation process of step 52 in Fig. 13;
Fig. 18 is a subroutine showing the detailed operation of the background designation process of step 52 in Fig. 13;
Fig. 19 is a diagram showing an example of various data stored in the RAM 24 in accordance with the processing of Fig. 13.
Embodiment
With reference to the drawings, an input coordinate processing method according to one embodiment of the present invention will now be described. The input coordinate processing method can be realized in the form of an input coordinate processing device, an input coordinate processing program, or a recording medium on which the input coordinate processing program is recorded. Below, as one embodiment for realizing the input coordinate processing method of the present invention, an input coordinate processing device that executes the input coordinate processing program is used for the description. The input coordinate processing program of the present invention is applicable to any computer system capable of performing display on a display device; here, however, the description uses as an example an input coordinate processing program executed on a game machine 1 serving as an information processing device (input coordinate processing device). Fig. 1 is an external view of the game machine 1 that executes the input coordinate processing program of the present invention. Here, a portable game machine is shown as one example of the game machine 1.
In Fig. 1, the game machine 1 includes a first LCD (Liquid Crystal Display) 11 and a second LCD 12. A housing 13 is composed of an upper housing 13a and a lower housing 13b; the first LCD 11 is accommodated in the upper housing 13a, and the second LCD 12 is accommodated in the lower housing 13b. The resolution of each of the first LCD 11 and the second LCD 12 is 256 dots × 192 dots. In the present embodiment, LCDs are used as the display devices, but any other display device, such as an EL (Electro Luminescence) display device, may be used instead. The first LCD 11 and the second LCD 12 may also have any resolution.
In the upper housing 13a, sound holes 18a and 18b are formed for emitting to the outside the sound from a pair of loudspeakers (30a and 30b of Fig. 2) described later.
In the lower housing 13b, the following are provided as input devices: a cross switch 14a, a start switch 14b, a select switch 14c, an A button 14d, a B button 14e, an X button 14f, a Y button 14g, a power switch 14h, an L button 14L, and an R button 14R. As a further input device, a touch panel 15 is mounted on the screen of the second LCD 12. The lower housing 13b is also provided with insertion openings for accommodating a memory card 17 and a stick 16.
As the touch panel 15, a panel of any type may be used, such as a resistive-film type, an optical (infrared) type, or a capacitive-coupling type. The touch panel 15 is an example of a pointing device having the function of outputting coordinate data corresponding to a contact position when its surface is touched with the stick 16. Hereinafter, the description assumes that the user operates the touch panel 15 with the stick 16, but a pen (stylus) or a finger may of course be used instead of the stick 16. In the present embodiment, a touch panel 15 whose resolution (detection accuracy) is 256 dots × 192 dots, the same as the resolution of the second LCD 12, is used. However, the resolution of the touch panel 15 does not necessarily have to match the resolution of the second LCD 12.
The memory card 17 is a recording medium on which the input coordinate processing program and the like are recorded, and is detachably inserted into the insertion opening provided in the lower housing 13b. The memory card 17 is one example of a recording medium on which the input processing program of the present invention is recorded.
Next, the internal configuration of the game machine 1 will be described with reference to Fig. 2. Fig. 2 is a block diagram showing the internal configuration of the game machine 1.
In Fig. 2, a CPU core 21 is mounted on an electronic circuit board 20 accommodated in the housing 13. A connector 23 is connected to the CPU core 21 via a bus 22, as are: an input/output interface circuit (denoted as an I/F circuit in the drawing) 25; a first GPU (Graphics Processing Unit) 26; a second GPU 27; a RAM 24; an LCD controller 31; and a wireless communication unit 33. The memory card 17 is detachably connected to the connector 23. The memory card 17 carries a ROM 17a storing the input coordinate processing program, and a RAM 17b capable of rewritably storing backup data. The input coordinate processing program stored in the ROM 17a of the memory card 17 is loaded into the RAM 24, and the input coordinate processing program loaded into the RAM 24 is executed by the CPU core 21. In addition to the input coordinate processing program, the RAM 24 also stores, as appropriate, temporary data generated by the CPU core 21 executing the program, obtained data, and the like. Connected to the I/F circuit 25 are the touch panel 15, a right loudspeaker 30a, a left loudspeaker 30b, and an operation switch unit 14 composed of the cross switch 14a, the A button 14d and so on of Fig. 1. The right loudspeaker 30a and the left loudspeaker 30b are disposed inside the sound holes 18a and 18b, respectively.
A first VRAM (Video RAM) 28 is connected to the first GPU 26, and a second VRAM 29 is connected to the second GPU 27. In accordance with an instruction from the CPU core 21, the first GPU 26 generates a first display image based on the data for generating display images stored in the RAM 24, and renders it into the first VRAM 28. Likewise, in accordance with an instruction from the CPU core 21, the second GPU 27 generates a second display image and renders it into the second VRAM 29. The first VRAM 28 and the second VRAM 29 are connected to the LCD controller 31.
The LCD controller 31 includes a register 32. The register 32 stores a value of 0 or 1 in accordance with an instruction from the CPU core 21. When the value of the register 32 is 0, the LCD controller 31 outputs the first display image rendered in the first VRAM 28 to the first LCD 11, and outputs the second display image rendered in the second VRAM 29 to the second LCD 12. When the value of the register 32 is 1, the LCD controller 31 outputs the first display image rendered in the first VRAM 28 to the second LCD 12, and outputs the second display image rendered in the second VRAM 29 to the first LCD 11.
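The register-controlled output routing described above can be sketched as follows; this is only an illustrative model of the behavior (value 0 routes each VRAM to its own LCD, value 1 swaps the outputs), and the function and dictionary names are assumptions, not part of the specification.

```python
# Hypothetical sketch of the LCD controller 31's register-based routing.

def route_output(register, first_image, second_image):
    """register: value of register 32 (0 or 1).
    first_image / second_image: images rendered in the first and second
    VRAMs. Returns which image each LCD receives."""
    if register == 0:
        # VRAM 28 -> first LCD 11, VRAM 29 -> second LCD 12
        return {'LCD1': first_image, 'LCD2': second_image}
    # register == 1: outputs are swapped
    return {'LCD1': second_image, 'LCD2': first_image}

assert route_output(0, 'A', 'B') == {'LCD1': 'A', 'LCD2': 'B'}
assert route_output(1, 'A', 'B') == {'LCD1': 'B', 'LCD2': 'A'}
```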
The wireless communication unit 33 has the function of exchanging data used in game processing, and other data, with the wireless communication unit 33 of another game machine; as one example, it provides a wireless communication function conforming to the IEEE 802.11 wireless LAN standard. The wireless communication unit 33 outputs received data to the CPU core 21, and transmits data designated by the CPU core 21 to the other game machine. Further, by implementing a browser using protocols such as TCP/IP (Transmission Control Protocol/Internet Protocol) and other predetermined protocols in the wireless communication unit 33 and a storage unit inside the game machine 1, the game machine 1 can connect, via the wireless communication unit 33, to a network such as the Internet. The game machine 1 can therefore download data such as documents and images published on the network, and view them on the first LCD 11 and the second LCD 12.
Note that the input coordinate processing program of the present invention may be supplied to the computer system not only via an external storage medium such as the memory card 17, but also via a wired or wireless communication line. The input coordinate processing program may also be recorded in advance in a nonvolatile storage device inside the computer system. Furthermore, the information storage medium storing the input coordinate processing program is not limited to the aforementioned nonvolatile semiconductor memory, and may be a CD-ROM, a DVD, or a similar optical-disc-type medium.
Next, with reference to Fig. 3 to Fig. 12, before the specific processing operation of the game machine 1 based on the input coordinate processing program is described, examples of the display modes shown on the first LCD 11 and the second LCD 12 as a result of that processing operation will be described. To make the description concrete, the following example is used: the game machine 1 is connected via the wireless communication unit 33 to a network such as the Internet, and data such as documents and images published on that network are viewed on the first LCD 11 and the second LCD 12. Fig. 3 is a diagram showing an example of the initial-stage screen display on the first LCD 11 and the second LCD 12. Fig. 4 is a diagram showing an example of the screen display on the first LCD 11 and the second LCD 12 when a tap operation has been performed on the touch panel 15 over a hyperlink target displayed on the second LCD 12. Fig. 5 is a diagram showing an example of the screen display when an upward slide operation has been performed on the touch panel 15 from a hyperlink target displayed on the second LCD 12. Fig. 6 is a diagram showing an example of the screen display when an upward slide operation followed by a reciprocating downward slide operation has been performed on the touch panel 15 from a hyperlink target displayed on the second LCD 12. Fig. 7 is a diagram showing an example of the screen display when a downward slide operation has been performed on the touch panel 15 from a tab displayed on the second LCD 12. Fig. 8 is a diagram showing an example of the screen display when a downward slide operation followed by a reciprocating upward slide operation has been performed on the touch panel 15 from a tab displayed on the second LCD 12. Fig. 9 is a diagram showing an example of the screen display when a slide operation has been performed on the touch panel 15 so as to surround partial information displayed on the second LCD 12. Fig. 10 is a diagram showing an example of the screen display when an upward slide operation has been performed on the touch panel 15 from the background image displayed on the second LCD 12. Fig. 11 is a diagram for explaining the regions that are set when a slide operation is performed on the touch panel 15 over the background image displayed on the second LCD 12. Fig. 12 is a diagram showing an example of the screen display when an upward slide operation followed by a reciprocating downward slide operation has been performed on the touch panel 15 from the background image displayed on the second LCD 12.
In Fig. 3, as one example of the initial stage, when information such as documents and images published on a network is viewed with the game machine 1 via the wireless communication unit 33, the currently selected information being viewed is displayed only on the second LCD 12. For example, a home page selected by the user is displayed on the second LCD 12. In the example shown in Fig. 3, text links and image links representing hyperlink targets, and tabs for switching to other information within the home page being viewed, are displayed as response regions within the home page. In addition, a set of icons for network browsing on the game machine 1 is displayed side by side at the left end of the display screen of the second LCD 12. One of these icons is set as a surround-mode icon.
In Fig. 4, when the user uses the stick 16 to perform a touch operation on the touch panel 15 over a hyperlink target displayed on the second LCD 12 (over an image link in Fig. 4), a circle M1 centered on the touched point is displayed (state shown on the left in Fig. 4). Then, when the user releases the stick 16 from the touch panel 15, the home page of the touched hyperlink target (a response image) is displayed on the second LCD 12 (state shown on the right in Fig. 4). That is, the home page displayed on the second LCD 12 changes to the home page of the hyperlink target, while the display state of the first LCD 11 remains unchanged. Hereinafter, an operation of touching the touch panel 15 momentarily and then releasing it is referred to as a tap operation.
In Fig. 5, when the user performs a touch operation on the touch panel 15 over a hyperlink target displayed on the second LCD 12 (over a text link in Fig. 5) and then slides the stick 16 upward without releasing it from the touch panel 15, an upward arrow M2 is displayed near the current touch point (state shown on the left in Fig. 5). Hereinafter, an operation of touching the touch panel 15 and sliding without releasing it is referred to as a slide operation. Specifically, the screen coordinate corresponding to the touch input coordinate first touched by the user on the touch panel 15 is set as an origin coordinate (x1, y1), and a judgment line y = y1 + c1 is set in the same screen coordinate system, where c1 is a predetermined constant. When an upward slide operation is performed on the touch panel 15 until the touch point reaches the region above the judgment line (touch input coordinate (xt, yt) in Fig. 5), the upward arrow M2 is displayed.
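The upward-slide recognition just described can be sketched as follows. The sketch assumes, as in the text, a screen coordinate system whose y axis increases toward the first LCD 11, and a constant `C1` standing in for the predetermined constant c1; the concrete value is an assumption for illustration.

```python
# Minimal sketch of the judgment-line test for an upward slide operation.

C1 = 20  # assumed value of the predetermined constant c1

def slide_state(origin, touch):
    """origin: first touched point (x1, y1); touch: current point (xt, yt).
    Returns 'up' once the touch point reaches the region above the
    judgment line y = y1 + c1 (where arrow M2 would be displayed)."""
    (_, y1), (_, yt) = origin, touch
    return 'up' if yt >= y1 + C1 else None

assert slide_state((100, 80), (100, 90)) is None   # still below the judgment line
assert slide_state((100, 80), (105, 101)) == 'up'  # crossed y = 80 + 20
```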
Then, after the user has performed the above slide operation into the region above the judgment line and released the stick 16 from the touch panel 15, the hyperlink target that was first touched (that is, the home page of the hyperlink target corresponding to the origin coordinate (x1, y1)) is displayed on the first LCD 11. The second LCD 12 does not change its display, and continues to display the home page as it is (state shown on the right in Fig. 5). That is, the display information of the second LCD 12 is retained, while the information designated by the initial touch operation (the response image) is displayed on the display unit (the first LCD 11) located in the direction of the user's slide operation (the direction from the second LCD 12 toward the first LCD 11).
Among Fig. 6, when the user to the shown hyperlink target of the 2nd LCD12 on (among Fig. 6 for text link on) carry out touch operation, do not leave from touch panel 15, on touch panel 15, carry out slide upward, carry out then when reciprocal downwards slide, near current touch operation point, show above-below direction arrow M3 (state of left figure among Fig. 6).Particularly, same with Fig. 5, set the initial point coordinate (x1, y1) and judge line y=y1+c1.Like this, on touch panel 15, carry out slide upward, after till the zone on reaching on the aforementioned judgement line, carry out slide round about, (touch input coordinate (the xt among Fig. 6 in the time of till the zone under reaching this judgement line, yt)), show above-below direction arrow M3.
After the user has performed the reciprocating up-and-down slide operation and lifts the stylus 16 off the touch panel 15 in the region below the judgment line, the page of the first-touched link (that is, the page of the link corresponding to the origin coordinate (x1, y1); the response image) is displayed on the second LCD 12, while the first LCD 11 displays the page that had been shown on the second LCD 12 until then (the state in the right view of Fig. 6). That is, the former display of the second LCD 12 is moved to the first LCD 11, the information designated by the initial touch operation is shown on the second LCD 12, and the information shown on the upper and lower display units (the first LCD 11 and the second LCD 12) is exchanged according to the user's slide direction (a reciprocation along the up-down direction in which the first LCD 11 and the second LCD 12 are arranged).
In Fig. 7, when the user touches a mark displayed on the second LCD 12 (in Fig. 7, the mark "xxx") through the touch panel 15, additional information (a response image) within the page being browsed that corresponds to the mark (that is, the mark "xxx" corresponding to the origin coordinate (x1, y1)) is immediately displayed on the first LCD 11. Then, when the user performs a downward slide operation without lifting the stylus 16 off the touch panel 15, a downward arrow M4 is displayed near the current touch point (the state in the left view of Fig. 7). Specifically, the screen coordinate corresponding to the touch input coordinate at which the user first touched the touch panel 15 is set as the origin coordinate (x1, y1), and a judgment line y = y1 - c2 is set in this screen coordinate system, where c2 is a predetermined constant. When the downward slide operation on the touch panel 15 reaches the region below the judgment line (the touch input coordinate (xt, yt) in Fig. 7), the downward arrow M4 is displayed.
After the user has performed the above slide operation and lifts the stylus 16 off the touch panel 15 in the region below the judgment line, the second LCD 12 comes to display the information that the first LCD 11 had been displaying for the touched mark (the state in the right view of Fig. 7). In other words, the page shown on the second LCD 12 changes to the additional information corresponding to the mark, while the first LCD 11 continues to display that additional information. That is, the displayed information of the first LCD 11 is kept, while the information designated by the initial touch operation (the response image) is displayed on the display unit located in the user's slide direction (downward from the first LCD 11 toward the second LCD 12), namely the second LCD 12. In the right view of Fig. 7, both the first LCD 11 and the second LCD 12 display the information designated by the initial touch operation; alternatively, the first LCD 11 may re-display the information that was shown before this touch operation.
In Fig. 8, when the user touches a mark displayed on the second LCD 12 (in Fig. 8, the mark "xxx"), the additional information (a response image) within the page being browsed that corresponds to the mark (that is, the mark "xxx" corresponding to the origin coordinate (x1, y1)) is likewise immediately displayed on the first LCD 11. Then, when the user slides downward without lifting off the touch panel 15 and then performs a returning upward slide, an up-down arrow M5 is displayed near the current touch point (the state in the left view of Fig. 8). Specifically, as in Fig. 7, the origin coordinate (x1, y1) and the judgment line y = y1 - c2 are set. When the downward slide operation reaches the region below the judgment line and a slide in the opposite direction then reaches the region above the judgment line (the touch input coordinate (xt, yt) in Fig. 8), the up-down arrow M5 is displayed.
After the user has performed the reciprocating up-and-down slide operation and lifts the stylus 16 off the touch panel 15 in the region above the judgment line, the second LCD 12 displays the information corresponding to the touched mark, while the first LCD 11 displays the page that had been shown on the second LCD 12 until then (the state in the right view of Fig. 8). That is, the former display of the second LCD 12 is moved to the first LCD 11, the information designated by the initial touch operation is shown on the second LCD 12, and the information shown on the upper and lower display units (the first LCD 11 and the second LCD 12) is exchanged according to the user's slide direction (a reciprocation along the up-down direction in which the second LCD 12 and the first LCD 11 are arranged).
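The three outcomes of the mark gestures of Figs. 7 and 8 can be condensed into one decision, sketched below; in the patent this decision is actually tracked with the mode flags DF4 to DF6 described later, so the function and its boolean parameters are illustrative:

```python
def mark_gesture_mode(reached_below, released_above):
    """Mode selection for the mark gestures of Figs. 7 and 8 (sketch).
    reached_below:  the slide reached the region below y = y1 - c2
    released_above: the stylus was lifted back above that judgment line"""
    if not reached_below:
        return 4   # fourth mode only: info shown on the first LCD 11
    if released_above:
        return 6   # sixth mode: reciprocating slide, displays swapped
    return 5       # fifth mode: info also shown on the second LCD 12
```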
In Fig. 9, when the user touches the surround-mode icon Ia, the game machine 1 shifts to the surround mode. In the surround mode, when the user performs a touch operation on the touch panel 15 with the stylus 16 so as to encircle part of the information displayed on the second LCD 12 (in Fig. 9, text written in the background image), a track M6 is displayed along the trajectory of the touch operation (the state in the left view of Fig. 9). Specifically, the screen coordinate corresponding to the touch input coordinate at which the user first touched is set as the origin coordinate (x1, y1), and an origin-vicinity region centered on the origin coordinate (x1, y1) is set in the same screen coordinate system. When the encircling slide operation on the touch panel 15 returns into the origin-vicinity region (the touch input coordinate (xt, yt) in Fig. 9), the track M6 is displayed. When the user then lifts the stylus 16 off the touch panel 15 within the origin-vicinity region, the information enclosed by the track M6 is enlarged (an enlarged image) and displayed on the first LCD 11. The second LCD 12 does not change its display and continues to show the page as it is (the state in the right view of Fig. 9). That is, the displayed information of the second LCD 12 is kept, while the information encircled by the touch operation is displayed enlarged on the other display unit (the first LCD 11).
In Fig. 10, when the user touches the background image displayed on the second LCD 12 through the touch panel 15 and slides upward without lifting off the touch panel 15, a guide image M7 is displayed near the origin coordinate and an upward arrow M8 is displayed near the current touch point (the state in the left view of Fig. 10).
Specifically, as shown in Fig. 11, the screen coordinate corresponding to the touch input coordinate at which the user first touched the touch panel 15 is set as the origin coordinate (x1, y1), and a plurality of regions are set in the same screen coordinate system with it as a reference. Around the origin coordinate (x1, y1), a prescribed region AM is set. In the screen coordinate system (x, y), the prescribed region AM is the region where x1 - k1 ≤ x ≤ x1 + k1 and y1 - k2 < y < y1 + k2, where k1 and k2 are predetermined constants. The guide image M7 is displayed on the boundary of the prescribed region AM.
Above the prescribed region AM, an upper region AT is set. In the screen coordinate system (x, y), the upper region AT is the region where x1 - k1 ≤ x ≤ x1 + k1 and y ≥ y1 + k2. Below the prescribed region AM, a lower region AB is set: the region where x1 - k1 ≤ x ≤ x1 + k1 and y ≤ y1 - k2. To the left of the prescribed region AM, a left region AL is set: the region where x < x1 - k1. To the right of the prescribed region AM, a right region AR is set: the region where x > x1 + k1. As shown in Fig. 10, when an upward slide operation on the touch panel 15 goes from the prescribed region AM (that is, the region surrounded by the guide image M7) into the upper region AT (the touch input coordinate (xt, yt) in Fig. 10), the upward arrow M8 is displayed.
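The five regions of Fig. 11 partition the screen, so a touch input coordinate can be classified with a few inequality checks. A sketch follows; the function name and the constant values k1 = 30, k2 = 20 are illustrative, not from the patent:

```python
def classify_region(touch, origin, k1=30, k2=20):
    """Classify a touch input coordinate into the regions of Fig. 11,
    relative to the origin coordinate (x1, y1)."""
    xt, yt = touch
    x1, y1 = origin
    if xt < x1 - k1:
        return "AL"   # left region (any y)
    if xt > x1 + k1:
        return "AR"   # right region (any y)
    if yt >= y1 + k2:
        return "AT"   # upper region
    if yt <= y1 - k2:
        return "AB"   # lower region
    return "AM"       # prescribed region around the origin
```

Note that AL and AR are checked first: they cover the full height of the screen, while AT and AB apply only within the x-band of the prescribed region AM.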
After the user has performed the above slide operation and lifts the stylus 16 off the touch panel 15 within the upper region AT, the page being browsed on the second LCD 12 (that is, the page designated by the origin coordinate (x1, y1)) is displayed on the first LCD 11. The second LCD 12 does not change its display and continues to show the page (the state in the right view of Fig. 10). That is, the displayed information of the second LCD 12 is kept, while the information shown at the starting point (the information displayed on the second LCD 12 and designated by the initial touch operation) is also displayed on the display unit located in the user's slide direction (upward from the second LCD 12 toward the first LCD 11), namely the first LCD 11.
In Fig. 12, when the user touches the background image displayed on the second LCD 12 through the touch panel 15, slides upward without lifting off until reaching the upper region AT, and then slides back downward until re-entering the prescribed region AM (the touch input coordinate (xt, yt) in Fig. 12), the guide image M7 is displayed near the origin coordinate and an up-down arrow M9 is displayed near the current touch point (the state in the left view of Fig. 12). The regions set at this time are the same as those described with Fig. 11.
After the user has performed the reciprocating up-and-down slide operation and lifts the stylus 16 off the touch panel 15 within the prescribed region AM, the page being browsed on the second LCD 12 (that is, the page designated by the origin coordinate (x1, y1)) is displayed on the first LCD 11, and the page that was being browsed on the first LCD 11 is displayed on the second LCD 12 (the state in the right view of Fig. 12). That is, the displayed information of the second LCD 12 is shown on the first LCD 11, the displayed information of the first LCD 11 is shown on the second LCD 12, and the information shown on the upper and lower display units (the first LCD 11 and the second LCD 12) is exchanged according to the user's slide direction (a reciprocation along the up-down direction in which the first LCD 11 and the second LCD 12 are arranged).
Next, with reference to Figs. 13 to 19, the concrete processing performed by the input coordinate processing program executed by the game machine 1 will be described. Fig. 13 is a flowchart showing the input coordinate processing that the game machine 1 performs by executing this program. Fig. 14 is a subroutine showing the detailed operation of the link designation processing of step 50 in Fig. 13. Fig. 15 is a subroutine showing the detailed operation of the mark designation processing of step 51 in Fig. 13. Figs. 16 to 18 are subroutines showing the detailed operation of the background designation processing of step 52 in Fig. 13. Fig. 19 is a diagram showing an example of the various data stored in the RAM 24 by the processing of Fig. 13. The programs for performing these processes are included in the input coordinate processing program stored in the ROM 17a; when the power of the game machine 1 is turned on, they are read from the ROM 17a into the RAM 24 and executed by the CPU core 21. To make the description concrete, the following operation is described using an example in which the first LCD 11 and the second LCD 12 are used to browse data such as documents and images published on a network such as the Internet, connected via the wireless communication unit 33.
First, when the power supply (not shown) of the game machine 1 is turned on, a boot program (not shown) is executed by the CPU core 21, whereby the input coordinate processing program stored in the memory card 17 is loaded into the RAM 24. The loaded input coordinate processing program is executed by the CPU core 21, thereby performing the steps shown in Fig. 13 (abbreviated as "S" in Figs. 13 to 18).
In Fig. 13, the CPU core 21 displays information such as a document or image published on a designated web page on at least the second LCD 12 according to the user's operation (step 41; see Fig. 3). The CPU core 21 then sets the touch input flag DFt stored in the RAM 24 to off (step 42) and proceeds to the next step. In step 42, the CPU core 21 also sets all of the first to twelfth mode flags DF1 to DF12 stored in the RAM 24 to off.
Here, as shown in Fig. 19, the coordinate data input from the touch panel 15 is converted, as needed, into the coordinate on the image displayed on the second LCD 12 that corresponds to the touched contact position on the touch panel 15, and is stored in the RAM 24 as the touch input coordinate DC1. The RAM 24 also appropriately stores, as position data DC used for generating images, the origin coordinate DC2 and the trajectory coordinates DC3, etc. Furthermore, in addition to the touch input flag DFt, the RAM 24 stores the first to twelfth mode flags DF1 to DF12 as flag data DF used to determine the processing to be executed next (hereinafter referred to as the next processing). As image data DI used for generating images that indicate the recognized operation gesture, the indicator images DI1, the trace image DI2, and the guide image DI3, etc. are appropriately stored.
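The working data that Fig. 19 describes in the RAM 24 could be modeled as a single record, sketched below; the class and field names are illustrative, and only the DC1/DC2/DC3, DFt, and DF1-DF12 items come from the patent:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class WorkRam:
    """Illustrative model of the data of Fig. 19 held in the RAM 24."""
    touch_input: Optional[Tuple[int, int]] = None  # DC1: current touch input coordinate
    origin: Optional[Tuple[int, int]] = None       # DC2: origin coordinate (x1, y1)
    trajectory: List[Tuple[int, int]] = field(default_factory=list)  # DC3
    touch_flag: bool = False                       # DFt: touch input flag
    mode_flags: List[bool] = field(default_factory=lambda: [False] * 12)  # DF1-DF12
```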
Returning to Fig. 13, the CPU core 21 determines whether there is a touch input from the touch panel 15 corresponding to a user operation (step 43). If there is a touch input, processing proceeds to the next step 44. If there is no touch input, the CPU core 21 determines whether to end browsing of the currently displayed information (step 53). When browsing continues, the CPU core 21 returns to step 43 and repeats the processing; when browsing ends, it ends the processing based on this program.
In step 44, the CPU core 21 stores the coordinate on the image displayed on the second LCD 12 that corresponds to the currently touched contact position on the touch panel 15 (that is, the current touch input coordinate DC1) into the RAM 24 as the origin coordinate DC2. Next, the CPU core 21 sets the touch input flag DFt stored in the RAM 24 to on (step 45) and proceeds to the next step.
Next, the CPU core 21 determines whether the image corresponding to the position indicated by the origin coordinate DC2 is a link (step 46), a mark (step 47), or a background image (step 48). When the image corresponding to the position indicated by the origin coordinate DC2 is a link such as an image link or a text link (Yes in step 46), the CPU core 21 performs the link designation processing (step 50) and returns to step 42 to repeat the processing. When it is a mark (Yes in step 47), the CPU core 21 performs the mark designation processing (step 51) and returns to step 42. When it is a background image (Yes in step 48), the CPU core 21 performs the background designation processing (step 52) and returns to step 42. When it is none of a link, a mark, or a background image (No in steps 46 to 48), the CPU core 21 performs processing corresponding to the position indicated by the origin coordinate DC2 (step 49) and returns to step 42. The detailed operations of the link designation processing, the mark designation processing, and the background designation processing are described below in turn.
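The branching of steps 46 to 52 is a plain dispatch on the kind of image under the origin coordinate. The sketch below is illustrative only; in the patent each branch jumps to a subroutine, whereas here string labels stand in for the subroutines:

```python
def dispatch(target_kind):
    """Illustrative dispatch of Fig. 13, steps 46-48: choose which
    designation processing handles the image at the origin coordinate DC2."""
    handlers = {
        "link": "link designation processing (step 50)",
        "mark": "mark designation processing (step 51)",
        "background": "background designation processing (step 52)",
    }
    # Anything else falls through to position-specific processing.
    return handlers.get(target_kind, "position-specific processing (step 49)")
```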
In Fig. 14, when the link designation processing of step 50 is performed, the CPU core 21 first sets the first mode flag DF1 stored in the RAM 24 to on (step 61). Then, using the indicator image DI1, the CPU core 21 displays a circle M1 centered on the current touch input coordinate DC1 over the information shown on the second LCD 12 (step 62; see Fig. 4), and determines whether the user has released the touch (step 63). When the user has released the touch, processing proceeds to the next step 64; when the touch operation continues, processing proceeds to step 65.
In step 64, the CPU core 21 executes the first mode based on the first mode flag DF1 currently set to on, and ends the processing of this subroutine. Here, the first mode means the display processing illustrated by the right view of Fig. 4: without changing the display of the first LCD 11, the page of the touched link is displayed on the second LCD 12.
In step 65, the CPU core 21 determines whether the current touch input coordinate DC1 is within the image region on or above the judgment line (hereinafter referred to as the region above the judgment line). When the touch input coordinate DC1 is in the region above the judgment line, the CPU core 21 proceeds to the next step 66; when it is in the region below the judgment line, the CPU core 21 returns to step 62 and repeats the processing. Here, with the origin coordinate DC2 = (x1, y1) as a reference, the CPU core 21 sets the judgment line used in step 65 as y = y1 + c1 (c1 being a constant). If the current touch input coordinate DC1 = (xt, yt) satisfies yt ≥ y1 + c1, the CPU core 21 determines that the current touch input coordinate DC1 is in the region above the judgment line.
In step 66, the CPU core 21 sets the second mode flag DF2 stored in the RAM 24 to on, and sets the first mode flag DF1 and the third mode flag DF3 to off. Then, using the indicator image DI1, the CPU core 21 displays the upward arrow M2 near the current touch input coordinate DC1 over the information shown on the second LCD 12 (step 67; see Fig. 5), and determines whether the user has released the touch (step 68). When the user has released the touch, processing proceeds to the next step 69; when the touch operation continues, processing proceeds to step 70.
In step 69, the CPU core 21 executes the second mode based on the second mode flag DF2 currently set to on, and ends the processing of this subroutine. Here, the second mode means the display processing illustrated by the right view of Fig. 5: without changing the display of the second LCD 12, the page of the link is displayed on the first LCD 11.
In step 70, the CPU core 21 determines whether the current touch input coordinate DC1 is in the region below the judgment line used in step 65. When the touch input coordinate DC1 is in the region below the judgment line, the CPU core 21 proceeds to the next step 71. On the other hand, when it is in the region above the judgment line, the CPU core 21 returns to step 67 and repeats the processing if the second mode flag DF2 is set to on, and returns to step 66 and repeats the processing if the second mode flag DF2 is set to off (step 75).
In step 71, the CPU core 21 sets the third mode flag DF3 stored in the RAM 24 to on, and sets the second mode flag DF2 to off. Using the indicator image DI1, the CPU core 21 displays the up-down arrow M3 near the current touch input coordinate DC1 over the information shown on the second LCD 12 (step 72; see Fig. 6), and determines whether the user has released the touch (step 73). When the user has released the touch, processing proceeds to the next step 74; when the touch operation continues, the CPU core 21 returns to step 70 and repeats the processing.
In step 74, the CPU core 21 executes the third mode based on the third mode flag DF3 currently set to on, and ends the processing of this subroutine. Here, the third mode means the display processing illustrated by the right view of Fig. 6: the information displayed on the second LCD 12 is shown on the first LCD 11, and the page of the link is displayed on the second LCD 12.
Here, after the third mode flag DF3 is set to on in step 71, when the slide operation continues back into the region above the judgment line (that is, No in step 70), step 66 is performed again. Therefore, the third mode flag DF3 is set to off, and the second mode flag DF2 is set to on once more. That is, the downward slide gesture on the touch panel 15 from the region above the judgment line into the region below it is cancelled, and only the upward slide from the origin coordinate DC2 into the region above the judgment line is treated as a valid operation gesture.
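The toggling between DF2 and DF3 in steps 66, 70, 71, and 75 behaves like a small two-state machine driven by crossings of the judgment line; a sketch follows, with state encoded as the active mode number (the encoding is an assumption, not the patent's flag representation):

```python
def update_link_gesture(state, above_line):
    """Two-state sketch of the gesture tracking of Figs. 5-6 (steps 66-75).
    state: 2 (DF2 on: one upward crossing) or 3 (DF3 on: down-and-back).
    above_line: whether the touch is currently above y = y1 + c1."""
    if state == 2 and not above_line:
        return 3   # crossed back below the line: up-down gesture (step 71)
    if state == 3 and above_line:
        return 2   # crossed above again: reciprocation cancelled (step 66)
    return state   # no crossing, state unchanged
```

Releasing the touch in state 2 yields the second mode, and in state 3 the third mode.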
In Fig. 15, when the mark designation processing of step 51 is performed, the CPU core 21 first sets the fourth mode flag DF4 stored in the RAM 24 to on (step 81). Next, the CPU core 21 executes the fourth mode based on the fourth mode flag DF4 currently set to on (step 82), and proceeds to the next step. Here, the fourth mode means the following processing: the additional information, within the page being browsed, that corresponds to the touched mark (that is, the mark corresponding to the origin coordinate DC2) is immediately displayed on the first LCD 11 (see, for example, the left views of Figs. 7 and 8).
Next, the CPU core 21 determines whether the current touch input coordinate DC1 is within the image region on or below the judgment line (hereinafter referred to as the region below the judgment line) (step 83). When the touch input coordinate DC1 is in the region below the judgment line, the CPU core 21 proceeds to the next step 85. When the touch input coordinate DC1 is in the region above the judgment line, the CPU core 21 determines whether the user has released the touch (step 84). When the touch operation continues, the CPU core 21 returns to step 83 and repeats the processing; when the user has released the touch, it ends the processing of this subroutine. Here, with the origin coordinate DC2 = (x1, y1) as a reference, the CPU core 21 sets the judgment line used in step 83 as y = y1 - c2 (c2 being a constant). If the current touch input coordinate DC1 = (xt, yt) satisfies yt ≤ y1 - c2, the CPU core 21 determines that the current touch input coordinate DC1 is in the region below the judgment line.
In step 85, the CPU core 21 sets the fifth mode flag DF5 stored in the RAM 24 to on, and sets the fourth mode flag DF4 and the sixth mode flag DF6 to off. Using the indicator image DI1, the CPU core 21 displays the downward arrow M4 near the current touch input coordinate DC1 over the information shown on the second LCD 12 (step 86; see Fig. 7), and determines whether the user has released the touch (step 87). When the user has released the touch, processing proceeds to the next step 88; when the touch operation continues, processing proceeds to step 89.
In step 88, the CPU core 21 executes the fifth mode based on the fifth mode flag DF5 currently set to on, and ends the processing of this subroutine. Here, the fifth mode means the display processing illustrated by the right view of Fig. 7: the additional information that the first LCD 11 displays for the touched mark is also displayed on the second LCD 12. In the fifth mode, both the first LCD 11 and the second LCD 12 display the information corresponding to the mark; alternatively, the first LCD 11 may re-display the information that was shown before this touch operation.
In step 89, the CPU core 21 determines whether the current touch input coordinate DC1 is in the region above the judgment line used in step 83. When the touch input coordinate DC1 is in the region above the judgment line, the CPU core 21 proceeds to the next step 90. When it is in the region below the judgment line, the CPU core 21 returns to step 86 and repeats the processing if the fifth mode flag DF5 is set to on, and returns to step 85 and repeats the processing if the fifth mode flag DF5 is set to off (step 94).
In step 90, the CPU core 21 sets the sixth mode flag DF6 stored in the RAM 24 to on, and sets the fifth mode flag DF5 to off. Using the indicator image DI1, the CPU core 21 displays the up-down arrow M5 near the current touch input coordinate DC1 over the information shown on the second LCD 12 (step 91; see Fig. 8), and determines whether the user has released the touch (step 92). When the user has released the touch, processing proceeds to the next step 93; when the touch operation continues, the CPU core 21 returns to step 89 and repeats the processing.
In step 93, the CPU core 21 executes the sixth mode based on the sixth mode flag DF6 currently set to on, and ends the processing of this subroutine. Here, the sixth mode means the display processing illustrated by the right view of Fig. 8: the former display of the second LCD 12 is shown on the first LCD 11, and the additional information corresponding to the touched mark, which the first LCD 11 had been displaying, is shown on the second LCD 12.
Here, after the sixth mode flag DF6 is set to on in step 90, when the slide operation continues back into the region below the judgment line (that is, No in step 89), step 85 is performed again. Therefore, the sixth mode flag DF6 is set to off, and the fifth mode flag DF5 is set to on once more. That is, the upward slide gesture on the touch panel 15 from the region below the judgment line into the region above it is cancelled, and only the downward slide from the origin coordinate DC2 into the region below the judgment line is treated as a valid operation gesture.
In Fig. 16, when the background designation processing of step 52 is performed, the CPU core 21 first determines whether the surround mode is currently set (step 101). When the surround mode is not set, the CPU core 21 proceeds to step 111 (Fig. 17). When the surround mode is set, the CPU core 21 proceeds to the next step 102. Here, the surround mode, as described with Fig. 9, is the mode to which the game machine 1 shifts when the user has touched the surround-mode icon Ia.
In step 102, the CPU core 21 determines whether the current touch input coordinate DC1 is within the origin-vicinity region. Next, the CPU core 21 determines whether the trajectory coordinates DC3, when connected in time order, form an enclosing curve (step 103). When the touch input coordinate DC1 is within the origin-vicinity region and the group of trajectory coordinates DC3 forms an enclosing curve (Yes in steps 102 and 103; see the left view of Fig. 9), the CPU core 21 proceeds to step 105. When the touch input coordinate DC1 is not within the origin-vicinity region, or the group of trajectory coordinates DC3 does not form an enclosing curve (No in step 102 or 103), the CPU core 21 determines whether the user has released the touch (step 104). When the user has released the touch, the CPU core 21 ends the processing of this subroutine. When the touch operation continues, the CPU core 21 appends the current touch input coordinate DC1 to the RAM 24 as a trajectory coordinate DC3 (step 109), and returns to step 101 to continue the processing.
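The patent does not specify how steps 102 and 103 decide that the trajectory "forms an enclosing curve"; one plausible sketch, under that stated assumption, combines a distance check against the origin-vicinity region with the shoelace formula for enclosed area (the radius and area thresholds are illustrative):

```python
import math

def encircles(trajectory, origin, radius=15):
    """Rough sketch of steps 102-103: the trajectory DC3, taken in time
    order, must return near the origin and enclose a non-trivial area.
    The shoelace-formula test is an assumption, not the patent's criterion."""
    if len(trajectory) < 3:
        return False
    xt, yt = trajectory[-1]
    x1, y1 = origin
    if math.hypot(xt - x1, yt - y1) > radius:
        return False  # not back inside the origin-vicinity region
    # Signed area via the shoelace formula; near-zero area means no enclosure.
    area = 0.0
    for (xa, ya), (xb, yb) in zip(trajectory, trajectory[1:] + trajectory[:1]):
        area += xa * yb - xb * ya
    return abs(area) / 2 > 1.0
```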
In step 105, the CPU core 21 sets the seventh mode flag DF7 stored in the RAM 24 to on. Using the trace image DI2, the CPU core 21 displays the track M6 along the current trajectory coordinates DC3 over the information shown on the second LCD 12 (step 106; see Fig. 9), and determines whether the user has released the touch (step 107). When the user has released the touch, processing proceeds to the next step 108; when the touch operation continues, the CPU core 21 returns to step 102 and continues the processing.
In step 108, the CPU core 21 executes the seventh mode based on the seventh mode flag DF7 currently set to on, and ends the processing of this subroutine. After executing the seventh mode, the CPU core 21 deletes all the trajectory coordinates DC3 stored in the RAM 24. Here, the seventh mode means the display processing illustrated by the right view of Fig. 9: without changing the display of the second LCD 12, the information enclosed by the track M6 is enlarged and displayed on the first LCD 11.
In Fig. 17, in the background designation processing of step 52, when the surround mode is not set (No in step 101), the CPU core 21 performs step 111. In step 111, the CPU core 21 determines whether the current touch input coordinate DC1 is within the upper region AT (see Fig. 11). When the touch input coordinate DC1 is within the upper region AT, the CPU core 21 proceeds to the next step 112; when it is not, the CPU core 21 proceeds to step 116. Here, with the origin coordinate DC2 = (x1, y1) as a reference, the CPU core 21 sets the upper region AT used in step 111 as x1 - k1 ≤ x ≤ x1 + k1 and y ≥ y1 + k2 (k1 and k2 being constants). If the current touch input coordinate DC1 = (xt, yt) satisfies x1 - k1 ≤ xt ≤ x1 + k1 and yt ≥ y1 + k2, the CPU core 21 determines that the current touch input coordinate DC1 is within the upper region AT.
In step 112, the CPU core 21 sets the eighth mode flag DF8 stored in the RAM 24 to on, and sets the ninth to twelfth mode flags DF9 to DF12 to off. Using the indicator image DI1, the CPU core 21 displays the upward arrow M8 near the current touch input coordinate DC1 over the information shown on the second LCD 12, and displays the guide image M7 around the origin coordinate DC2 (step 113; see Fig. 10), and then determines whether the user has released the touch (step 114). When the user has released the touch, processing proceeds to the next step 115; when the touch operation continues, processing proceeds to step 116.
In step 115, the CPU core 21 executes the eighth mode based on the eighth mode flag DF8 currently set to on, and ends the processing of this subroutine. Here, the eighth mode means the display processing illustrated by the right view of Fig. 10: the display of the second LCD 12 is kept, and the information displayed on the second LCD 12 is also displayed on the first LCD 11.
In step 116, the CPU core 21 determines whether the current touch input coordinate DC1 is within the specified area AM (see Figure 11). When the touch input coordinate DC1 is not within the specified area AM, the CPU core 21 advances the processing to step 131 (Figure 18). When the touch input coordinate DC1 is within the specified area AM, the CPU core 21 advances the processing to step 118 if the eighth mode flag DF8 is set to on, and to step 131 if the eighth mode flag DF8 is set to off (step 117). Here, with the start point coordinate DC2 (x1, y1) as a reference, the CPU core 21 sets the specified area AM used in step 116 to x1−k1 ≤ x ≤ x1+k1 and y1−k2 < y < y1+k2 (k1 and k2 being constants). If the current touch input coordinate DC1 (xt, yt) satisfies x1−k1 ≤ xt ≤ x1+k1 and y1−k2 < yt < y1+k2, the CPU core 21 determines that the current touch input coordinate DC1 is within the specified area AM.
In step 118, the CPU core 21 sets the ninth mode flag DF9 stored in the RAM 24 to on, and sets the eighth mode flag DF8 to off. Using the indicator image DI1, the CPU core 21 displays an up-and-down arrow M9 near the current touch input coordinate DC1 on the information displayed on the second LCD 12, and displays the guide image M7 around the start point coordinate DC2 (step 119; see Figure 12), and then determines whether the user has interrupted the touch (step 120). When the user has interrupted the touch, the CPU core 21 advances the processing to step 121; when the touch operation is continuing, it advances the processing to step 131 (Figure 18).
In step 121, the CPU core 21 executes the ninth mode based on the ninth mode flag DF9 currently set to on, and ends the processing of this subroutine. Here, the ninth mode refers to the display processing illustrated in the right diagram of Figure 12: processing that exchanges the information shown on the first LCD 11 and the information shown on the second LCD 12, displaying each on the other of the first LCD 11 and the second LCD 12.
In step 131 of Figure 18, the CPU core 21 determines whether the current touch input coordinate DC1 is within the lower area AB (see Figure 11). When the touch input coordinate DC1 is not within the lower area AB, the CPU core 21 advances the processing to step 136. When the touch input coordinate DC1 is within the lower area AB, the CPU core 21 advances the processing to step 132. Here, with the start point coordinate DC2 (x1, y1) as a reference, the CPU core 21 sets the lower area AB used in step 131 to x1−k1 ≤ x ≤ x1+k1 and y ≤ y1−k2 (k1 and k2 being constants). Thus, if the current touch input coordinate DC1 (xt, yt) satisfies x1−k1 ≤ xt ≤ x1+k1 and yt ≤ y1−k2, the CPU core 21 determines that the current touch input coordinate DC1 is within the lower area AB.
In step 132, the CPU core 21 sets the tenth mode flag DF10 stored in the RAM 24 to on, and sets the eighth, ninth, eleventh, and twelfth mode flags DF8, DF9, DF11, and DF12 to off. Using the indicator image DI1, the CPU core 21 displays a downward arrow M10 (not shown) near the current touch input coordinate DC1 on the information displayed on the second LCD 12, and displays the guide image M7 around the start point coordinate DC2 (see Figure 10) (step 133), and then determines whether the user has interrupted the touch (step 134). When the user has interrupted the touch, the CPU core 21 advances the processing to step 135; when the touch operation is continuing, it advances the processing to step 136.
In step 135, the CPU core 21 executes the tenth mode based on the tenth mode flag DF10 currently set to on, and ends the processing of this subroutine. Here, the tenth mode refers to processing in which the display information on the first LCD 11 is retained while the information shown on the first LCD 11 is also displayed on the second LCD 12. That is, in the tenth mode, the display information on the first LCD 11 is retained, while the information displayed on the starting-point side of the user's slide direction (from the first LCD 11 downward toward the second LCD 12), that is, the information shown on the first LCD 11, is displayed on the display unit located in that slide direction (the second LCD 12).
In step 136, the CPU core 21 determines whether the current touch input coordinate DC1 is within the left area AL (see Figure 11). When the touch input coordinate DC1 is not within the left area AL, the CPU core 21 advances the processing to step 141. When the touch input coordinate DC1 is within the left area AL, the CPU core 21 advances the processing to step 137. Here, with the start point coordinate DC2 (x1, y1) as a reference, the CPU core 21 sets the left area AL used in step 136 to x < x1−k1 (k1 being a constant). Thus, if the current touch input coordinate DC1 (xt, yt) satisfies xt < x1−k1, the CPU core 21 determines that the current touch input coordinate DC1 is within the left area AL.
In step 137, the CPU core 21 sets the eleventh mode flag DF11 stored in the RAM 24 to on, and sets the eighth through tenth mode flags DF8–DF10 and the twelfth mode flag DF12 to off. Using the indicator image DI1, the CPU core 21 displays a leftward arrow M11 (not shown) near the current touch input coordinate DC1 on the information displayed on the second LCD 12, and displays the guide image M7 around the start point coordinate DC2 (see Figure 10) (step 138), and then determines whether the user has interrupted the touch (step 139). When the user has interrupted the touch, the CPU core 21 advances the processing to step 140; when the touch operation is continuing, it advances the processing to step 141.
In step 140, the CPU core 21 executes the eleventh mode based on the eleventh mode flag DF11 currently set to on, and ends the processing of this subroutine. Here, the eleventh mode refers to processing in which, in response to the user's leftward slide operation, the display information on the first LCD 11 is retained while the second LCD 12 displays the information it showed in the previous processing, corresponding to the page preceding the information currently shown on the second LCD 12. For example, in the eleventh mode, a so-called "back" operation of a web browser or a previous-page display operation of an electronic book can be performed on the information shown on the second LCD 12. When an electronic book is displayed on the second LCD 12, a leftward instruction generally displays the previous page in a horizontally formatted electronic book; however, the assignment may be configured otherwise depending on the document format of the displayed electronic book (for example, vertically formatted books, comics, or foreign-language documents).
In step 141, the CPU core 21 determines whether the current touch input coordinate DC1 is within the right area AR (see Figure 11). When the touch input coordinate DC1 is not within the right area AR, the CPU core 21 returns to the aforementioned step 111 (Figure 17) and continues the processing. When the touch input coordinate DC1 is within the right area AR, the CPU core 21 advances the processing to step 142. Here, with the start point coordinate DC2 (x1, y1) as a reference, the CPU core 21 sets the right area AR used in step 141 to x > x1+k1 (k1 being a constant). If the current touch input coordinate DC1 (xt, yt) satisfies xt > x1+k1, the CPU core 21 determines that the current touch input coordinate DC1 is within the right area AR.
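Putting the five area definitions of steps 111, 116, 131, 136, and 141 together, the classification of a touch coordinate relative to the start point can be sketched as follows. This is a hypothetical sketch: the function name is invented, the checks mirror the flowchart order, and the constants k1 and k2 are assumed values.

```python
# Flowchart-order classification of the current touch coordinate DC1
# relative to the start point coordinate DC2 (steps 111/116/131/136/141).
# Hypothetical sketch; k1 and k2 values are assumed.
def classify_area(touch, origin, k1=20, k2=15):
    xt, yt = touch
    x1, y1 = origin
    in_band = x1 - k1 <= xt <= x1 + k1
    if in_band and yt >= y1 + k2:
        return "AT"   # upper area      -> up arrow M8, flag DF8
    if in_band and y1 - k2 < yt < y1 + k2:
        return "AM"   # specified area  -> flag DF9, if DF8 was on
    if in_band and yt <= y1 - k2:
        return "AB"   # lower area      -> down arrow M10, flag DF10
    if xt < x1 - k1:
        return "AL"   # left area       -> left arrow M11, flag DF11
    return "AR"       # right area      -> right arrow M12, flag DF12
```

Note that the three band conditions together cover every y value inside the band, so every coordinate falls into exactly one of the five areas.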
In step 142, the CPU core 21 sets the twelfth mode flag DF12 stored in the RAM 24 to on, and sets the eighth through eleventh mode flags DF8–DF11 to off. Using the indicator image DI1, the CPU core 21 displays a rightward arrow M12 (not shown) near the current touch input coordinate DC1 on the information displayed on the second LCD 12, and displays the guide image M7 around the start point coordinate DC2 (see Figure 10) (step 143), and then determines whether the user has interrupted the touch (step 144). When the user has interrupted the touch, the CPU core 21 advances the processing to step 145; when the touch operation is continuing, it advances the processing to step 111.
In step 145, the CPU core 21 executes the twelfth mode based on the twelfth mode flag DF12 currently set to on, and ends the processing of this subroutine. Here, the twelfth mode refers to processing in which, in response to the user's rightward slide operation, the display information on the first LCD 11 is retained while, as the next processing, the second LCD 12 displays information ahead of what it has shown, corresponding to the page following the information currently shown on the second LCD 12. For example, in the twelfth mode, a so-called "forward" operation of a web browser or a next-page display operation of an electronic book can be performed on the information shown on the second LCD 12. When an electronic book is displayed on the second LCD 12, a rightward instruction generally displays the next page in a horizontally formatted electronic book; however, depending on the document format of the displayed electronic book, the instruction to display the previous page may instead be assigned to the rightward slide operation.
Here, in the background designation processing other than the aforementioned mode, after any one of the eighth through twelfth mode flags DF8–DF12 has been set to on, when a slide operation is performed from the currently touched area to another area, the mode flag that was set to on is set to off and the other area's mode flag is set to on. For example, after the eighth mode flag DF8 has been set to on in the aforementioned step 112, when a slide operation continues until the left area AL is reached (that is, Yes in the aforementioned step 136), the eighth mode flag DF8 is set to off and the eleventh mode flag DF11 is set to on. In this case, the upward slide gesture on the touch panel 15 from the start point coordinate DC2 to the upper area AT is cancelled, and only the leftward slide operation from the start point coordinate DC2 to the left area AL is regarded as the valid operation gesture. That is, a slide gesture for specifying an operation can be easily cancelled, and simply by sliding to another area, only the slide operation from the start point coordinate DC2 to that area is regarded as the valid operation gesture.
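The mutually exclusive flag behavior just described, including the special handling of a return to the center area in step 117, can be sketched as a small state holder. This is an illustrative assumption, not the patent's implementation; the class and method names are invented.

```python
# Sketch of the mutually exclusive mode flags DF8-DF12: sliding into a
# new area turns the previous flag off and the new one on, which is
# exactly how an earlier gesture is cancelled.  Hypothetical names.
FLAG_FOR_AREA = {"AT": "DF8", "AB": "DF10", "AL": "DF11", "AR": "DF12"}

class ModeFlags:
    def __init__(self):
        self.active = None  # at most one of DF8-DF12 is on at a time

    def on_area_entered(self, area):
        if area == "AM":
            # Step 117: re-entering the center area only matters when
            # DF8 is on; the gesture then becomes the reciprocal DF9.
            if self.active == "DF8":
                self.active = "DF9"
        else:
            # Entering any other area cancels the previous gesture:
            # the old flag goes off, the new area's flag goes on.
            self.active = FLAG_FOR_AREA[area]

    def on_touch_released(self):
        # The mode executed when the touch ends (steps 115, 121, ...).
        return self.active
```

For instance, sliding up and then left ends with only DF11 on (the upward gesture is cancelled), while sliding up and back to the center ends with DF9 on.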
In the above configuration, gestures in four directions using the touch panel 15 are recognized; however, the configuration may also recognize operation gestures in five or more directions. By further subdividing each of the areas surrounding the specified area AM, gestures in five or more directions using the touch panel 15 can be recognized. In this case, the guide image M7 is displayed as a rectangular pattern representing the boundary of the specified area AM (see Figure 10); although it serves as a target image for the user to recognize valid operation gestures on the touch panel 15, it may also be displayed as a pattern of another shape. For example, when not only the four-direction gestures described above but also multi-directional operation gestures on the touch panel 15 can be recognized, a polygonal pattern corresponding to the number of distinguishable directions may be displayed.
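One way to realize the five-or-more-direction recognition mentioned here is angular sectoring around the start point. This is an assumed approach for illustration only, not the subdivision the patent describes; the function name, the circular dead zone, and all constants are invented.

```python
# Assumed n-direction scheme: outside a central dead zone (standing in
# for the specified area AM), classify the touch by its angle from the
# start point, dividing the plane into n equal sectors.
import math

def classify_direction(touch, origin, n=8, dead_radius=15):
    """Return a sector index 0..n-1, or None inside the dead zone."""
    xt, yt = touch
    x1, y1 = origin
    if math.hypot(xt - x1, yt - y1) < dead_radius:
        return None  # still inside the central area
    angle = math.atan2(yt - y1, xt - x1) % (2 * math.pi)
    sector = 2 * math.pi / n
    # Shift by half a sector so sector 0 is centered on angle 0.
    return int(((angle + sector / 2) % (2 * math.pi)) // sector)
```

A guide polygon with n sides, as the text suggests, would then correspond one-to-one with these sectors.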
Thus, in the game machine 1, in the third mode of step 74, the sixth mode of step 93, the seventh mode of step 108, the ninth mode of step 121, and so on, the user's gesture is recognized from a combination of a plurality of slide directions input via the touch panel 15 (for example, up and down), and the processing to execute is determined accordingly. Therefore, compared with distinguishing the user's operation gesture by a single slide direction alone, the variety of assignable processing is increased. On the other hand, a slide gesture can be easily cancelled: simply by sliding to another area, only the slide operation from the start point coordinate to that area is regarded as the valid operation gesture. Therefore, when a slide operation performed by the user is inappropriate, the user can easily cancel it and give a new operation instruction within one continuous operation. That is, according to the present invention, the variety of processing assignable by coordinate input operations is increased, the inconvenience arising when an inappropriate trajectory has been drawn is reduced, and the operability of coordinate input is improved. In addition, since the operation gesture recognized from the coordinate input operation entered by the user is displayed as an indicator near the touch position, the user can reliably recognize the operation gesture that has been input and the processing to be executed next.
In addition, in the game machine 1, display processing of high utility can be executed through intuitive slide operations on the touch panel 15. For example, when the user performs an upward slide operation on the touch panel 15, the image obtained in the processing corresponding to the start point coordinate DC2 (a response image, an enlarged image, or the image shown on the second LCD 12) can be displayed on the first LCD 11 located in the slide direction (from the second LCD 12 upward toward the first LCD 11), while the display information on the second LCD 12 is retained. When the user performs a downward slide operation on the touch panel 15, the image obtained in the processing corresponding to the start point coordinate DC2 can be displayed on the second LCD 12 located in the slide direction (from the first LCD 11 downward toward the second LCD 12). Furthermore, when the user performs a reciprocating up-and-down slide operation on the touch panel 15, the image shown on the second LCD 12 can be displayed on the first LCD 11, and the image obtained in the processing corresponding to the start point coordinate DC2 (a response image, or the image shown on the first LCD 11) can be displayed on the second LCD 12, so that the information shown on the upper and lower display units lying along the slide direction (the first LCD 11 and the second LCD 12) is exchanged between them. The slide direction input to the touch panel 15 and the corresponding display processing are thus intuitive for the user. Moreover, retaining the original display information while displaying the information corresponding to the processing on the other display unit proves useful in a variety of situations.
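The correspondence between the recognized gesture (the active mode flag) and the resulting display operation on the two screens can be summarized in a small dispatch table. This is shorthand for the behavior described in steps 115, 121, 135, 140, and 145; the table and function names, and the action strings, are illustrative assumptions, not the patent's API.

```python
# Shorthand dispatch from the active mode flag to the display action
# described in the text.  Purely illustrative.
MODE_ACTIONS = {
    "DF8":  "copy 2nd LCD image to 1st LCD (8th mode, upward slide)",
    "DF9":  "swap images of 1st and 2nd LCD (9th mode, up-down reciprocal)",
    "DF10": "copy 1st LCD image to 2nd LCD (10th mode, downward slide)",
    "DF11": "show previous page on 2nd LCD (11th mode, leftward slide)",
    "DF12": "show next page on 2nd LCD (12th mode, rightward slide)",
}

def execute_mode(flag):
    """Return the display action for the flag, or a no-op when the
    touch ended with no valid gesture (flag is None)."""
    return MODE_ACTIONS.get(flag, "no operation (gesture cancelled)")
```

Keeping the gesture-to-action mapping in one table also makes it easy to see that every slide direction leaves the origin screen's content intact, as the text emphasizes.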
In the foregoing, each input coordinate process has been described using concrete operation sequences for the purpose of specific explanation; however, these are merely one embodiment, and it goes without saying that the present invention is not limited to these operation sequences. For example, further slide operations recognizable as operation gestures on the touch panel 15 may be added.
Also for the purpose of specific explanation, the description used the example of browsing data such as documents and images published on a network on the first LCD 11 and the second LCD 12; however, other information and images may be displayed on the first LCD 11 and the second LCD 12. For example, electronic books, such as books including dictionaries recorded as digital data on an electronic medium readable by the game machine 1, may also be displayed on the first LCD 11 and the second LCD 12.
In the aforementioned embodiment, as an example of a liquid crystal display unit with a screen divided into two parts, the case in which the physically separate first LCD 11 and second LCD 12 are arranged one above the other (the case of two vertically arranged screens) was described. However, other configurations may be adopted for a display screen divided into two parts. For example, the first LCD 11 and the second LCD 12 may be arranged side by side on the main surface of the lower housing 13b. Alternatively, a vertically long LCD having the same width as the second LCD 12 and twice its height (that is, a physically single LCD whose display area is vertically divided into two screens) may be arranged on the main surface of the lower housing 13b so that the first and second display images are displayed one above the other (that is, displayed adjacently with no boundary between the upper and lower parts). Likewise, a horizontally long LCD having the same height as the second LCD 12 and twice its width may be arranged on the main surface of the lower housing 13b so that the first and second display images are displayed side by side (that is, displayed adjacently with no boundary between the left and right parts). In other words, a physically single screen may be divided into two parts for use, thereby displaying the first and second display images. In any of these display forms, the present invention can likewise be realized by arranging the touch panel 15 on the display screen of the second display image. When a physically single screen is divided into two parts to display the first and second display images, the touch panel 15 may also be arranged over the entire screen.
In the foregoing embodiment, the touch panel 15 is provided integrally with the game machine 1; however, it goes without saying that the present invention can also be realized when the game machine and the touch panel are constituted separately. The touch panel 15 may also be provided on the first LCD 11. Further, although two display screens (the first LCD 11 and the second LCD 12) are provided in the aforementioned embodiment, depending on the mode of the present invention only one display screen may be provided. That is, the first LCD 11 may be omitted and the touch panel 15 provided with only the second LCD 12 serving as the display screen. Alternatively, the second LCD 12 may be omitted and the touch panel 15 provided on the first LCD 11.
In the aforementioned embodiment, a touch panel is used as the input means of the game machine 1; however, another pointing device may be adopted. Here, a pointing device means an input device for specifying an input position or coordinate on a screen; for example, the present invention can likewise be realized by using a mouse, a track pad, a trackball, or the like as the input means, together with information on the screen coordinate system calculated from the output values of the input means.
In the aforementioned embodiment, the touch panel 15 is provided integrally with the game machine 1; however, an information processing apparatus (input coordinate processing apparatus) such as an ordinary personal computer having a touch panel as one of its input means may also be employed.
The input coordinate processing program and input coordinate processing apparatus of the present invention can increase the number of user operation gestures recognizable with a pointing device while allowing an input operation gesture to be easily cancelled, and are useful as information processing apparatuses, such as game machines, that browse information such as documents and images published on a network or display information such as electronic documents on a display device, and as programs executed on such information processing apparatuses. In addition, the input coordinate processing program and input coordinate processing apparatus of the present invention can execute display processing of high utility through intuitive operations in accordance with the user's operation of a pointing device, and are useful as information processing apparatuses, such as game machines, that browse information such as documents and images published on a network or display information such as electronic books on a display device, and as programs executed on such information processing apparatuses.
The present invention has been described in detail above; however, the foregoing description merely illustrates the present invention in all respects and does not limit its scope. It goes without saying that various improvements and modifications can be made without departing from the scope of the present invention.
The contents of Japanese Patent Applications No. 2005-109986 and No. 2005-109987 are incorporated herein by reference.

Claims (14)

1. An input coordinate processing method that operates on input coordinates output from a pointing device in accordance with a user's operation, the input coordinate processing method comprising:
an input coordinate detection step of detecting coordinate information based on a prescribed coordinate system from the input coordinates output from the aforementioned pointing device;
a reference coordinate storage step of setting and storing a reference coordinate based on the initial coordinate information in the series of coordinate information detected in the aforementioned input coordinate detection step;
a multi-area setting step of forming a plurality of areas with the aforementioned reference coordinate as a reference, and setting the area containing the reference coordinate as a reference area; and
a processing decision step of deciding processing based on the last coordinate information in the series of coordinate information detected in the aforementioned input coordinate detection step and on the aforementioned reference coordinate,
wherein the aforementioned processing decision step comprises:
a first processing decision step of deciding processing, when the aforementioned last coordinate information indicates a point within an area other than the aforementioned reference area, based on the direction from the aforementioned reference coordinate to the area indicated by the last coordinate information; and
a second processing decision step of deciding processing, when an intermediate piece of coordinate information in the series of coordinate information detected in the aforementioned input coordinate detection step indicates a point within an area other than the aforementioned reference area and the aforementioned last coordinate information indicates a point within the aforementioned reference area, based on the combination of the direction from the aforementioned reference coordinate to the area indicated by that intermediate coordinate information and the direction from that area back to the aforementioned reference coordinate.
2. The input coordinate processing method according to claim 1, wherein:
the aforementioned processing decision step further comprises:
a flag setting step of, when the position indicated by the aforementioned last coordinate information moves from the aforementioned reference area to another area, setting and storing a flag corresponding to that other area;
a first flag update step of, after a flag has been set in the aforementioned flag setting step, when the position indicated by the aforementioned last coordinate information further moves from the area for which the flag is set to another area different from the aforementioned reference area, updating the currently set flag to the flag corresponding to the area indicated by the last coordinate information, and storing it; and
a second flag update step of, after a flag has been set in the aforementioned flag setting step or the aforementioned first flag update step, when the position indicated by the aforementioned last coordinate information further moves from the area for which the flag is set to the aforementioned reference area, updating the currently set flag to a flag representing a reciprocal movement between the aforementioned reference area and the area for which the flag was set, and storing it; and wherein
the aforementioned first processing decision step decides processing based on the flag set in the aforementioned flag setting step or the aforementioned first flag update step, and
the aforementioned second processing decision step decides processing based on the flag set in the aforementioned second flag update step.
3. The input coordinate processing method according to claim 1, further comprising a display control step of displaying on a display device indicators respectively representing the following directions: the direction from the aforementioned reference coordinate to the other area indicated by the aforementioned last coordinate information, for which processing has been decided in the aforementioned first processing decision step, and the combination of directions for which processing has been decided in the aforementioned second processing decision step.
4. The input coordinate processing method according to claim 1, further comprising a display control step of displaying on a display device an image representing the reference area set in the aforementioned multi-area setting step.
5. The input coordinate processing method according to claim 1, wherein:
the method further comprises a display control step of displaying an image based on the aforementioned coordinate system on a display device, and
the aforementioned multi-area setting step forms the aforementioned plurality of areas by dividing the image region surrounding the aforementioned reference area, centered on the reference area, into a plurality of parts.
6. An input coordinate processing method that operates on input coordinates output from a pointing device in accordance with a user's operation, the input coordinate processing method comprising:
an input coordinate detection step of detecting coordinate information based on a prescribed coordinate system from the input coordinates output from the aforementioned pointing device;
a reference coordinate storage step of setting and storing a reference coordinate based on the initial coordinate information in the series of coordinate information detected in the aforementioned input coordinate detection step;
a first processing decision step of deciding processing based on the direction from the position indicated by the aforementioned reference coordinate to the position indicated by the last coordinate information in the series of coordinate information detected in the aforementioned input coordinate detection step;
a second processing decision step of deciding processing based on the combination of the direction from the position indicated by the aforementioned reference coordinate to the position indicated by an intermediate piece of coordinate information in the series of coordinate information detected in the aforementioned input coordinate detection step and the direction from the position indicated by that intermediate coordinate information back to the position indicated by the aforementioned reference coordinate; and
a display control step of displaying on a display device an image based on the aforementioned coordinate system and, near the position in the image indicated by the aforementioned last coordinate information, displaying indicators respectively representing the following directions: the direction for which processing has been decided in the aforementioned first processing decision step, and the combination of directions for which processing has been decided in the aforementioned second processing decision step.
7. An input coordinate processing method for displaying screens on a first display unit and a second display unit that constitute a display device, based on input coordinates output from a pointing device in accordance with a user's operation, the input coordinate processing method comprising:
an input coordinate storage step of detecting and storing, from the input coordinates output from the aforementioned pointing device, coordinate information based on the display coordinate system of the image shown on the aforementioned display device;
a reference coordinate storage step of setting the initial coordinate information in the series of coordinate information stored in the aforementioned input coordinate storage step as a reference coordinate, and storing it;
a processing step of obtaining a display image based on at least one piece of coordinate information in the aforementioned series of coordinate information; and
a display control step of displaying, on the aforementioned second display unit, the display image obtained in the aforementioned processing step when the operation direction indicated by the aforementioned series of coordinate information in the aforementioned display coordinate system, with the aforementioned reference coordinate as a reference, represents a reference direction from the position of the aforementioned first display unit toward the position of the aforementioned second display unit.
8. The input coordinate processing method according to claim 7, wherein:
the aforementioned input coordinate processing method executes, based on the aforementioned input coordinates, display processing of a web browser that downloads and browses files via a communication unit communicating with a network and displays them on the aforementioned display device;
in the aforementioned processing step, when the aforementioned reference coordinate is located within a response area of the web browser, a response image corresponding to that response area is obtained; and
in the aforementioned display control step, when the aforementioned operation direction indicates the aforementioned reference direction, the aforementioned response image is displayed on the aforementioned second display unit.
9. The input coordinate processing method according to claim 8, wherein:
in the aforementioned processing step, when the aforementioned reference coordinate is located within a background area of the web browser, at least part of the image shown on the aforementioned first display unit is obtained; and
in the aforementioned display control step, when the aforementioned operation direction indicates the aforementioned reference direction, at least the part of the image shown on the aforementioned first display unit is displayed on the aforementioned second display unit.
10. The input coordinate processing method according to claim 8, wherein:
the aforementioned reference direction is the upward or downward direction in which the aforementioned second display unit is disposed relative to the aforementioned first display unit of the aforementioned display device;
in the aforementioned processing step, when the aforementioned reference coordinate is located within a background area of the web browser and the aforementioned operation direction indicates the leftward or rightward direction, an image previously obtained in the processing step is obtained; and
in the aforementioned display control step, when the aforementioned operation direction indicates the leftward or rightward direction, the aforementioned previously obtained image is displayed on the aforementioned first display unit.
11. The input coordinate processing method according to claim 7, wherein:
in the aforementioned processing step, when the trajectory of the aforementioned series of coordinate information surrounds a prescribed area shown on the aforementioned first display unit, an enlarged image in which the image of the prescribed area has been magnified is obtained; and
in the aforementioned display control step, the aforementioned enlarged image is displayed on the aforementioned second display unit.
12. The input coordinate processing method according to claim 7, wherein:
the aforementioned reference direction is the upward or downward direction in which the aforementioned second display unit is disposed relative to the aforementioned first display unit of the aforementioned display device;
in the aforementioned processing step, when the aforementioned operation direction indicates the leftward or rightward direction, an image of another page different from the page shown on the aforementioned first display unit is obtained; and
in the aforementioned display control step, when the aforementioned operation direction indicates the leftward or rightward direction, the aforementioned other page image is displayed on the aforementioned first display unit.
13. The input coordinate processing method according to claim 7, characterized in that:
the coordinate information in the aforementioned display coordinate system corresponding to the input coordinates output from the aforementioned pointing device can be set for only one of the image displayed on the aforementioned first display part and the image displayed on the aforementioned second display part.
14. An input coordinate processing method for displaying images on a first display part and a second display part constituting a display device, based on input coordinates output from a pointing device according to a user's operation, the input coordinate processing method being characterized by performing:
an input coordinate storing step of detecting and storing, according to the input coordinates output from the aforementioned pointing device, coordinate information in a display coordinate system of the image displayed on the aforementioned display device;
a reference coordinate storing step of setting, as a reference coordinate, the initial coordinate information among a series of coordinate information stored in the aforementioned input coordinate storing step, and storing it;
a processing step of obtaining a display image based on at least one piece of coordinate information among the aforementioned series of coordinate information; and
a display control step of, when the operation direction indicated by the aforementioned series of coordinate information in the aforementioned display coordinate system, with the aforementioned reference coordinate as a basis, represents a direction reciprocating back and forth along the arrangement direction of the aforementioned first display part and the aforementioned second display part, displaying on the aforementioned first display part at least a partial image of what has been displayed on this second display part, while displaying on the aforementioned second display part the display image obtained in the aforementioned processing step.
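The claims above share one mechanism: the first sampled coordinate becomes the reference coordinate, a gesture is classified by its dominant direction of movement away from that reference, and a stroke that moves toward the other display part and returns near the reference is treated as a reciprocating "swap" gesture. The sketch below is only an illustration of that logic under stated assumptions; the function name, return labels, and thresholds are hypothetical and do not appear in the patent.

```python
# Illustrative sketch of the reference-coordinate direction logic in the
# claims: first sample = reference coordinate; classify the stroke by its
# dominant axis of motion, treating an up/down excursion that returns near
# the reference as a reciprocating "swap". Thresholds are arbitrary.


def classify_gesture(coords, threshold=30.0):
    """coords: list of (x, y) samples in display coordinates, y increasing
    downward. Returns 'left', 'right', 'up', 'down', 'swap', or None."""
    if len(coords) < 2:
        return None
    ref_x, ref_y = coords[0]  # reference coordinate storing step
    # Farthest signed excursion from the reference along each axis.
    peak_dx = max((x - ref_x for x, _ in coords), key=abs)
    peak_dy = max((y - ref_y for _, y in coords), key=abs)
    last_x, last_y = coords[-1]
    returned = (abs(last_x - ref_x) < threshold and
                abs(last_y - ref_y) < threshold)
    # Reciprocating gesture: a vertical excursion that came back near the
    # reference coordinate (displays are stacked vertically here).
    if returned and abs(peak_dy) >= threshold and abs(peak_dy) >= abs(peak_dx):
        return 'swap'
    end_dx, end_dy = last_x - ref_x, last_y - ref_y
    if max(abs(end_dx), abs(end_dy)) < threshold:
        return None  # too small to count as an operation direction
    if abs(end_dx) >= abs(end_dy):
        return 'right' if end_dx > 0 else 'left'
    return 'down' if end_dy > 0 else 'up'
```

Under this reading, 'left'/'right' would map to the browser back and page-change operations of claims 10 and 12, while 'swap' triggers the display-exchange behavior of claim 14.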
CNB2005101329549A 2005-04-06 2005-12-29 input coordinate processing method Active CN100538613C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2005109987 2005-04-06
JP2005109986 2005-04-06
JP2005109986A JP4719494B2 (en) 2005-04-06 2005-04-06 Input coordinate processing program and input coordinate processing apparatus

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN2009100086919A Division CN101655753B (en) 2005-04-06 2005-12-29 input coordinate processing method

Publications (2)

Publication Number Publication Date
CN1845051A true CN1845051A (en) 2006-10-11
CN100538613C CN100538613C (en) 2009-09-09

Family

ID=37063983

Family Applications (2)

Application Number Title Priority Date Filing Date
CNB2005101329549A Active CN100538613C (en) 2005-04-06 2005-12-29 input coordinate processing method
CN2009100086919A Active CN101655753B (en) 2005-04-06 2005-12-29 input coordinate processing method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN2009100086919A Active CN101655753B (en) 2005-04-06 2005-12-29 input coordinate processing method

Country Status (2)

Country Link
JP (1) JP4719494B2 (en)
CN (2) CN100538613C (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102099777A (en) * 2008-07-25 2011-06-15 日本电气株式会社 Information processing device, information processing program, and display control method
CN102473043A (en) * 2009-07-30 2012-05-23 夏普株式会社 Portable display device, method of controlling portable display device, program, and recording medium
CN101765826B (en) * 2007-09-27 2012-07-18 株式会社日立医药 Information display device
CN102742168A (en) * 2010-02-04 2012-10-17 三星电子株式会社 Mobile device with dual display units and method for controlling the dual display units
CN102754066A (en) * 2010-02-10 2012-10-24 三星电子株式会社 Mobile terminal having multiple display units and data handling method for the same
CN102782631A (en) * 2010-02-10 2012-11-14 三星电子株式会社 Screen control method and apparatus for mobile terminal having multiple touch screens
US8313375B2 (en) 2007-09-28 2012-11-20 Konami Digital Entertainment Co., Ltd. Input instruction processing device, communication system therefor, computer program therefor, and information recording medium therewith
CN104111790A (en) * 2013-04-17 2014-10-22 富士通株式会社 Display device
CN104216613A (en) * 2008-06-30 2014-12-17 日本电气株式会社 Information processing device, display control method, and recording medium
US8966400B2 (en) 2010-06-07 2015-02-24 Empire Technology Development Llc User movement interpretation in computer generated reality

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008152362A (en) 2006-12-14 2008-07-03 Konami Digital Entertainment:Kk Game program, game device and game control method
JP4971908B2 (en) * 2007-08-24 2012-07-11 任天堂株式会社 Information processing program and information processing apparatus
JP5334171B2 (en) * 2009-01-30 2013-11-06 シャープ株式会社 Electronic device and display control method
KR101601049B1 (en) * 2010-02-10 2016-03-08 삼성전자주식회사 Portable terminal having dual display unit and method for providing clipboard function therefor
US9405444B2 (en) 2010-10-01 2016-08-02 Z124 User interface with independent drawer control
JP5790380B2 (en) * 2011-09-28 2015-10-07 株式会社Jvcケンウッド Electronic device, control method of electronic device, and program
TWI475473B (en) 2012-02-17 2015-03-01 Mitac Int Corp Method for generating split screen according to a touch gesture
JP5939437B2 (en) * 2012-06-11 2016-06-22 コニカミノルタ株式会社 Operation display device, image processing device

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS58116377A (en) * 1981-12-28 1983-07-11 任天堂株式会社 Handheld type game apparatus
JPH04369027A (en) * 1991-06-18 1992-12-21 Fujitsu Ltd Portable information processor having two screens
JPH05127819A (en) * 1991-10-31 1993-05-25 Nec Corp Stroke command supporting system
JP3579061B2 (en) * 1992-08-31 2004-10-20 株式会社東芝 Display device
JPH0876926A (en) * 1994-09-02 1996-03-22 Brother Ind Ltd Picture display device
US5847698A (en) * 1996-09-17 1998-12-08 Dataventures, Inc. Electronic book device
US6313853B1 (en) * 1998-04-16 2001-11-06 Nortel Networks Limited Multi-service user interface
JP2000010655A (en) * 1998-06-22 2000-01-14 Toshiba Corp Portable information equipment
JP2000163193A (en) * 1998-11-25 2000-06-16 Seiko Epson Corp Portable information equipment and information storage medium
JP2000267813A (en) * 1999-03-17 2000-09-29 Sharp Corp Touch panel input type electronic equipment
JP2000278373A (en) * 1999-03-29 2000-10-06 Ricoh Co Ltd Portable electronic equipment
JP2001005438A (en) * 1999-06-21 2001-01-12 Sony Corp Display device and its method
JP2002032211A (en) * 2000-05-08 2002-01-31 Fujitsu Ltd Information display device, medium and program
JP2002091688A (en) * 2000-09-12 2002-03-29 Fuji Xerox Co Ltd Method and device for supporting input of stroke command

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101765826B (en) * 2007-09-27 2012-07-18 株式会社日立医药 Information display device
US8970526B2 (en) 2007-09-28 2015-03-03 Konami Digital Entertainment Co., Ltd. Input instruction processing device, communication system therefor, computer program therefor, and information recording medium therewith
US8313375B2 (en) 2007-09-28 2012-11-20 Konami Digital Entertainment Co., Ltd. Input instruction processing device, communication system therefor, computer program therefor, and information recording medium therewith
CN104216613B (en) * 2008-06-30 2018-02-09 日本电气株式会社 Message processing device, display control method and recording medium
CN104216613A (en) * 2008-06-30 2014-12-17 日本电气株式会社 Information processing device, display control method, and recording medium
CN102099777B (en) * 2008-07-25 2014-01-29 日本电气株式会社 Information processing device and display control method
CN102099777A (en) * 2008-07-25 2011-06-15 日本电气株式会社 Information processing device, information processing program, and display control method
CN102473043B (en) * 2009-07-30 2014-11-26 夏普株式会社 Portable display device, and method of controlling portable display device
CN102473043A (en) * 2009-07-30 2012-05-23 夏普株式会社 Portable display device, method of controlling portable display device, program, and recording medium
CN102742168A (en) * 2010-02-04 2012-10-17 三星电子株式会社 Mobile device with dual display units and method for controlling the dual display units
CN102742168B (en) * 2010-02-04 2015-06-17 三星电子株式会社 Mobile device with dual display units and method for controlling the dual display units
CN102782631A (en) * 2010-02-10 2012-11-14 三星电子株式会社 Screen control method and apparatus for mobile terminal having multiple touch screens
CN102754066A (en) * 2010-02-10 2012-10-24 三星电子株式会社 Mobile terminal having multiple display units and data handling method for the same
US8966400B2 (en) 2010-06-07 2015-02-24 Empire Technology Development Llc User movement interpretation in computer generated reality
CN104111790A (en) * 2013-04-17 2014-10-22 富士通株式会社 Display device

Also Published As

Publication number Publication date
JP4719494B2 (en) 2011-07-06
CN100538613C (en) 2009-09-09
CN101655753A (en) 2010-02-24
JP2006293476A (en) 2006-10-26
CN101655753B (en) 2011-11-30

Similar Documents

Publication Publication Date Title
CN1845051A (en) Input position processing device
CN102725711B (en) Edge gesture
EP1912112B1 (en) Storage medium storing input position processing program, and input position processing device
JP4435011B2 (en) Input coordinate processing program and input coordinate processing apparatus
CN1276373C (en) Apparatus and method for information processing and recording medium and program used therefor
CN1841373A (en) Electronic manual display apparatus
CN1265208A (en) Data communications
CN1758205A (en) Flick gesture
CN1918453A (en) Navigation system and course guiding method
CN101063924A (en) Method and device for making user elect multiple objects in a document
CN1848116A (en) Information processing device, information processing system, and information processing method
CN101046795A (en) Numeral display control device and numeral display control method
CN1173283C (en) Document image processing apparatus and its method and recording medium with all program
JP2014016712A (en) Information processing apparatus, and information processing method and program
CN1841251A (en) Programmable terminal system
JP2017215756A (en) Writing system, information processing device, and program
CN101134638B (en) Substrate pattern input system, pattern forming method and recording medium
CN1656473A (en) Ink collection and rendition
CN1417684A (en) Track information searching device and method
JP5770121B2 (en) Handwriting input device, handwriting input program, handwriting input method
CN1282933C (en) Communication terminal, storing media and communication system
CN1148684C (en) Document information processing device
CN1224885C (en) Electronic mail handling equipment and method thereof, program for carrying out such method and its recording medium
CN101057271A (en) Automatic switching for a dual mode digitizer
WO2012043369A1 (en) Display terminal and control program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant