US20130244539A1 - Interacting toys - Google Patents


Info

Publication number
US20130244539A1
Authority
US
United States
Prior art keywords
toy
data
interactions
interaction
avatar
Prior art date
Legal status
Abandoned
Application number
US13/639,411
Inventor
Steven Lipman
Current Assignee
Individual
Original Assignee
Librae Ltd
Priority date
Filing date
Publication date
Application filed by Librae Ltd filed Critical Librae Ltd
Assigned to LIBRAE LIMITED (assignment of assignors interest; assignor: LIPMAN, STEVEN)
Publication of US20130244539A1 publication Critical patent/US20130244539A1/en
Assigned to LIPMAN, STEVEN (assignment of assignors interest; assignor: LIBRAE LIMITED)

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63H - TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H33/00 - Other toys
    • A63H3/00 - Dolls
    • A63H3/28 - Arrangements of sound-producing means in dolls; Means in dolls for producing sounds
    • A63H2200/00 - Computerized interactive toys, e.g. dolls



Abstract

The present invention relates to toys that are enabled to interact with other such toys, and in particular a toy comprising: a processor for generating interactions between such toy and at least one other toy capable of interacting with such toy; an interaction tracking engine for generating data related to the interactions; and a memory connected to said interaction tracking engine for storing said data. The invention further relates to a server comprising: means for communicating with a plurality of toys, means for receiving data related to each said toy; means for processing said data; and means for allocating points to each said toy in dependence on said processed data.

Description

  • This invention relates to toys. In particular, although not exclusively, this invention relates to toys such as dolls that interact with each other.
  • Embedded computers and micro-processors have improved toys for children. They have been used most extensively in educational toys, but have also been used in interactive toys. ActiMates® Barney® is one example of an interactive toy that responds to interaction from a child with appropriate vocalisations, and can sing along to videos.
  • Interaction Tracking
  • According to one aspect of the present invention there is provided a toy comprising: a processor for generating interactions between such toy and at least one other toy capable of interacting with such toy; an interaction tracking engine for generating data related to the interactions; and a memory connected to said interaction tracking engine for storing said data.
  • Preferably, the toy further comprises means for outputting said data.
  • Preferably, the memory is adapted to store said data relating to a plurality of interactions.
  • Preferably, the type of said data is predetermined.
  • Preferably, the data includes a measure of the interactions. More preferably, the measure includes a count related to the interactions and/or the measure is a temporal measure.
  • Preferably, the measure includes at least one of the following: the total number of separate interactions, the total number of separate interactions between the toy and each other specific toy, the total number of incidences of the toy using a particular interaction, the time of day of the interaction and the total time the toy has participated in interactions.
  • Preferably, the data includes whether a predetermined interaction, such as a specific phrase and/or word, has been used during an interaction.
  • Preferably, the interaction is an audible interaction (for example speech), and/or a physical interaction.
  • Preferably, the toy further comprises means for analysing said data, wherein said analysing means determines when a predetermined target value, associated with said data, has been reached. More preferably, the predetermined target value is a count related to said data and/or the predetermined target value is a duration.
  • Preferably, the predetermined target value includes at least one of: the total number of separate interactions, the total number of separate interactions between the toy and each other specific toy, the total number of incidences of the toy using a particular interaction, the time of day of the interaction and the total time the toy has participated in interactions.
  • Preferably, the toy further comprises means for outputting said analysis. More preferably, the analysis outputting means incorporates a unique identifier, associated with said toy, with said analysis.
  • Preferably, the data outputting means incorporates a unique identifier, associated with said toy, with said data.
  • Preferably, the toy is a computer, and the form of said toy is represented by an avatar on said computer's screen.
  • Preferably, the toy is an individual object.
  • According to a further aspect of the present invention there is provided a server comprising: means for communicating with a plurality of toys, means for receiving data related to each said toy; means for processing said data; and means for allocating points to each said toy in dependence on said processed data.
  • Preferably, the points are stored in memory associated with each respective toy.
  • Preferably, the server further comprises means for comparing the points associated with each toy. More preferably, the comparison means is adapted to generate a ranked list of toys according to the number of points associated with each respective toy. Yet more preferably, the processing means determines when a predetermined target value, associated with said data, has been reached.
  • Preferably, the predetermined target value is a count related to said data and/or is a duration.
  • Preferably, the toy is a toy substantially as herein described.
  • Preferably, the predetermined target value includes at least one of: the total number of separate interactions, the total number of separate interactions between the toy and each other specific toy, the total number of incidences of the toy using a particular interaction, the time of day of the interaction and the total time the toy has participated in interactions.
  • Augmented Reality with Intercommunication
  • According to a yet further aspect of the present invention there is provided an augmented reality system, including: a processor; means for receiving a code; an avatar generation engine adapted to generate an avatar in dependence on said code; and means for outputting data representing an image comprising the avatar generated by the avatar generation engine.
  • Preferably, the code receiving means is adapted to receive a code via a manual input.
  • Preferably, the code receiving means is adapted to receive a code via a camera.
  • Preferably, the code is on a toy, and the toy may be a doll or a card.
  • Preferably, the system further comprises means for communicating with a physical toy. Preferably, the communication means includes a wireless adapter.
  • Preferably, the system further comprises means for identifying said toy, and a theme stored within said toy, wherein said avatar and said toy then communicate within said theme. Preferably, the interaction includes, speech and actions.
  • Preferably, the system further comprises: means for receiving image data, said image data representing an image of a physical toy in a physical environment, wherein said outputting means is adapted to output data representing said image comprising the generated avatar together with the image of the physical toy in the physical environment; and means for receiving activity data representing an action of the physical toy; wherein the processor is adapted to analyse said received activity data and to generate an action for performance by the avatar in response to the received activity data, whereby said avatar and said physical toy appear to interact in said physical environment.
  • According to a yet further aspect of the present invention there is provided an augmented reality system, comprising: a processor; an avatar generation engine adapted to generate an avatar; means for outputting data representing an image comprising the generated avatar together with the image of the physical toy in the physical environment; and means for receiving activity data representing an action of the physical toy; wherein the processor is adapted to analyse said received activity data and to generate an action for performance by the avatar in response to the received activity data, whereby said avatar and said physical toy appear to interact in said physical environment.
  • Preferably, the activity data represents speech and/or a physical action.
  • Apparatus and method features may be interchanged as appropriate, and may be provided independently one of another. Any feature in one aspect of the invention may be applied to other aspects of the invention, in any appropriate combination. In particular, method aspects may be applied to apparatus aspects, and vice versa.
  • Embodiments of this invention will now be described, by way of example only, with reference to the accompanying drawings, of which:
  • FIG. 1 (Prior art) is a schematic illustration of a doll;
  • FIG. 2 (Prior art) is a schematic illustration of a wireless communications dongle;
  • FIG. 3 is a schematic illustration of a doll with interactions tracking;
  • FIG. 4 is a diagram of an augmented reality device;
  • The basic features and operation of such interacting toys are known in the art, for example in International Patent Publication No. WO2009/010760; however a brief description is provided below to aid in the understanding of the present invention.
  • The following description relates to a known toy, such as a toy doll, that is enabled to communicate with other such toys; the dolls are adapted to coordinate speech between themselves.
  • FIG. 1 shows a schematic representation of the known doll, with the hardware components required to allow the doll to communicate, and perform other such tasks. The doll 100, as shown in FIG. 1, comprises a processor 102 that includes a wireless module 104. The processor is in communication with memory 106, ROM 108, and RAM 110. An IR/RF transmitter/receiver is connected to the processor/wireless module and is enabled to transmit/receive signals to/from other such dolls. The doll is also connected to a loud speaker 114. A USB controller 116 is used to update the memory 106, and also to charge, via the charger circuitry 118, the battery 120.
  • The memory 106 stores information relating to conversations that the dolls can have, and is accessed by the processor when it is compiling speech. The ROM 108 is used to store permanent information relating to the doll, such as the doll's name and ID number. This information is used in the initialisation procedure when setting up a network of dolls. The RAM 110 stores information relating to the current conversation and is used to produce more realistic conversation by storing, for example, information relating to the phrases already used.
  • Each doll 100 contains in memory 106: a data set containing the doll's name, and other variables defined during a conversation; a set of instructions which produces the conversation; and a set of audio data. The variables defined during the conversation are only stored in the controller doll.
  • The dolls are adapted to download a theme via a PC from a website, and then converse in that theme with other such dolls.
  • A USB communications dongle that enables a PC to interact wirelessly with a toy is also described in International Patent Publication No. WO2009/010760. FIG. 2 shows a schematic representation of the USB communications dongle 1600, attached to a PC 304, and in wireless communication with the dolls 100. The dongle contains a wireless module 104, an IR/RF transmitter/receiver 212, and an interface 1602. These components, except the interface 1602, are the same as those contained within the doll 100, as described above. However, the PC 304 is utilised as the processor 1604, instead of the dongle having an independent processor as the doll 100 has, and so the PC effectively becomes a virtual doll able to communicate with the physical dolls 100. The virtual doll is provided with an animated avatar shown on the PC monitor, that may be similar in appearance to the real doll, and whereby the animation of the avatar is synchronised with the speech of the doll. In order to run the conversations, the PC has stored in memory 1606 an emulator for emulating the processor of the toy.
  • The website is arranged to allow the user to download various themes, and also to interact with other users. This enables the users to interact both in the virtual world—via chat rooms, games, competitions, or the like—and in the physical world—by playing with other users with the communicating dolls.
  • In preferred embodiments of the present invention, a website, described in further detail below, is provided that enables a user to log in, and register his/her details, such as the type and number of toy dolls he/she has, the name of his/her toy doll(s), etc. The website provides features such as: friendship groups via social network, live chat, downloadable stories (related to the downloadable themes), and communicating characters with real voices. Characters can be named, styled, saved, and posted up to the website so that other users can vote upon them. This allows users to compete with each other for points that are awarded to the users based on the voting, the number of posts to the website, etc.
  • Also, in the physical world, Limited Edition labels are provided in the doll clothes. The Limited Edition labels are provided with a code that can be entered into the website to accumulate points, or to receive special gifts, etc.
  • The website is provided with the functionality to rank users, with the results being updated continually, but with weekly, monthly, and annual competitions. The competitions can be regional, national, and international. This enables an "X Factor" style competition amongst users both nationally and on a global scale, allows prizes to be awarded, and allows bronze, silver, and gold loyalty cards/awards to be provided to the users.
  • The users can accumulate points in dependence on a number of other factors, such as:
      • Creating an account on the website (for example this may provide bonus points)
      • The type of theme that is downloaded into the doll (for example, downloading a more socially acceptable theme, e.g. health or sport related, provides more points than a less socially acceptable theme)
      • The statistics of the interactions between the physical dolls (see below for further details)
      • Doing well in quizzes
      • Playing games, including Top Trumps-style card games, on-line games, etc
      • Finding special cards within physical card packs bought by the user
      • Hearing the “Golden Phrase” of the week (see below for further details)
  • Further, during the challenges, quizzes, games, etc, the user can be aided by his/her online friends in order to accumulate points more quickly. The users can create groups (of their friends) and compete with other such groups in a group ranking competition, similar to the individual user competition described above.
  • Interaction Tracking
  • FIG. 3 shows a schematic diagram of a toy doll. The doll 200 comprises similar components to the prior art doll 100, but further comprises an interaction tracking engine 202 that includes additional memory 204. The interaction tracking engine is connected to the processor 102.
  • As with doll 100, doll 200 is adapted to download themes via a PC from a website, and then converse in that theme with other such dolls. The conversations are constructed in a similar way to that described in International Patent Publication No. WO2009/010760, which is hereby incorporated by reference; see in particular Page 12 line 28 to Page 18 line 2.
  • During the interactions between the dolls, the interaction tracking engine is utilised to track the interactions and store in memory 204 the statistics relating to those interactions. The statistics that are stored include, but are not limited to, some or all of the following measures:
      • the total number of separate interactions between the doll and any other doll, or dolls (for example, the interaction could be between a group of more than two dolls);
      • the total number of separate interactions between the doll, and each other specific doll (i.e. the number of separate interactions between Doll A and Doll B, between Doll A and Doll C, etc);
      • the total number of incidences of the doll using a particular word or phrase (i.e. the number of times Doll A says “I have a new dress!”);
      • the total time the doll has participated in interactions;
      • the total time the doll has participated in interactions since the last time the doll was connected to a PC and the website;
      • whether or not a specific phrase has been used during an interaction; and
      • the time of day that each interaction occurred.
  • In the above, an interaction with a PC avatar doll is counted as if it were an interaction with a physical doll.
  • The interaction tracking engine monitors the interactions between dolls utilising doll identifiers, specific to each type of doll (for example, all type A dolls have the same identifier), or alternatively the identifiers are specific to each individual doll. The interaction tracking engine is adapted to create a database, stored in memory 204, that lists the interactions between the doll and other dolls using the doll identifiers. Further, since each phrase, or word, has an identifier, any of the phrases, or words, can be tracked in the database in the same way as described above.
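As a rough illustration, the per-doll and per-phrase counting described above could be kept in a small in-memory store; the class and field names below are hypothetical, not taken from the patent:

```python
from collections import Counter

class InteractionTracker:
    """Sketch of an interaction tracking engine (hypothetical names).

    Tracks the measures listed above: total separate interactions,
    interactions per other doll ID, phrase usage counts, and total
    interaction time.
    """

    def __init__(self):
        self.total_events = 0                 # total separate interactions
        self.interactions_with = Counter()    # other doll ID -> count
        self.phrase_counts = Counter()        # phrase ID -> times used
        self.total_interaction_seconds = 0

    def record_interaction(self, other_doll_ids, duration_seconds, phrase_ids=()):
        # one event may involve a group of dolls; each participant is counted
        self.total_events += 1
        for doll_id in other_doll_ids:
            self.interactions_with[doll_id] += 1
        self.total_interaction_seconds += duration_seconds
        for phrase_id in phrase_ids:
            self.phrase_counts[phrase_id] += 1
```

On upload, the counters would be serialised into the data format of FIG. 5 and sent to the server.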
  • Using the phrase/word tracking ability the “Golden Phrase” is tracked to determine if/when the phrase is used by the doll. By connecting the doll to a PC, logging into the website, and then downloading the latest theme, the user is provided with the “Golden Phrase” within the theme. On the website the “Golden Phrase” is announced, and when the user hears the “Golden Phrase” during an interaction between a group of dolls, the event of the doll using the “Golden Phrase” is tracked and stored in the interaction tracking database. Thus, when the doll is once again logged in to the website, the website verifies that the “Golden Phrase” has been used and awards points, or a prize, to the user.
  • As is known, an interaction is constructed by allocating weightings to each possible response at any point in the interaction. The “Golden Phrase” is generally provided with a low weighting (i.e. the probability that the “Golden Phrase” is used during an interaction is relatively low), and thus the user may be required to initiate interaction in any one theme a number of times before the “Golden Phrase” is used; this ensures that the life span of any one theme is increased, and the users gain reward points in the process.
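The weighted selection of the next phrase, with the "Golden Phrase" carrying a deliberately low weighting, can be sketched as follows (the function name and weight values are illustrative assumptions, not from the patent):

```python
import random

def choose_phrase(weighted_phrases, rng=random):
    """Pick the next phrase in proportion to its weighting.

    weighted_phrases: dict mapping phrase ID -> weight. A "Golden
    Phrase" would carry a low weight, so the user may need to run an
    interaction in a theme several times before it is ever spoken.
    """
    phrases = list(weighted_phrases)
    weights = [weighted_phrases[p] for p in phrases]
    return rng.choices(phrases, weights=weights, k=1)[0]
```

With, say, a 1% weighting on the Golden Phrase, its rarity both extends the life span of a theme and spaces out the associated reward points.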
  • Likewise, points can be awarded for reaching certain targets of any of the statistics; for example, once the user's doll has had 5, 10, 20, etc interactions in one particular time period (e.g. a week, a month, or a year) a number of points will be awarded. Any other statistic can be used as the basis for awarding points, for example the length of the interactions (i.e. points for each 10 minutes of interaction in a week).
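A minimal sketch of the target-based awards just described, assuming illustrative target values and a flat number of points per target reached:

```python
def points_for_interaction_targets(interaction_count,
                                   targets=(5, 10, 20),
                                   points_per_target=10):
    """Award points for each interaction-count target reached in a
    period (e.g. a week). Targets and award size are illustrative; the
    same shape works for duration-based targets."""
    return points_per_target * sum(1 for t in targets if interaction_count >= t)
```

Any of the tracked statistics could be fed through the same kind of threshold check, e.g. total interaction minutes instead of interaction count.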
  • FIG. 4 shows a schematic diagram of the back-end server 400 that communicates with the doll 200 via the user's PC 402 and the website 404; the back-end server facilitates the operation of the website and the awarding and storing of points as described above. The Doll's unique ID is verified by checking the Doll ID memory 406 located in the server. Likewise, when the user logs into the website the user's ID is verified by checking the User ID memory 408. All of the data transmitted from the Doll to the server, or from the server to the doll 200/user 410, is passed through the data interface 412.
  • The server processor 414 processes the data received via the data interface, and determines the number of reward points that are to be assigned to the user 410 based on the interaction data downloaded from the doll using a rewards engine 416. Alternatively, the doll is adapted to determine the number of reward points that are due by pre-processing the data accumulated by the interaction tracking engine before transmitting the data to the server. In this way, the amount of data transferred between the doll and the server can be reduced.
  • As discussed above, rewards can be provided to the user once certain goals have been achieved. Hence, the reward engine 416 accesses the data relating to the number of points the user 410 has stored in the memory 418 within the points accumulator 420. Once a goal number of points has been achieved a reward is provided to the user. Since the memory 418 stores data for all users the reward engine can provide listings of the users with the most points, thus enabling the weekly, monthly, and yearly rewards to be given to the appropriate users. In addition, the reward engine is adapted to provide the doll with the “Golden Phrase”; details of the “Golden Phrase” are described above. The creator of the server end data, i.e. a webmaster or the like, inputs the current “Golden Phrase” by referencing a word/phrase using the word/phrase ID number already used within the themes. Alternatively, a special word/phrase can be incorporated into the theme during its generation for use as the “Golden Phrase”.
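The server-side points accumulator 420 and the ranked listings used for the weekly, monthly, and yearly rewards might look like the following sketch; the API is assumed, not specified by the patent:

```python
class PointsAccumulator:
    """Sketch of the server-side points store and ranking (hypothetical API)."""

    def __init__(self):
        self.points = {}  # user ID -> accumulated points

    def add_points(self, user_id, amount):
        self.points[user_id] = self.points.get(user_id, 0) + amount

    def ranked_users(self):
        # highest points first, as used to pick the periodic winners
        return sorted(self.points, key=self.points.get, reverse=True)
```

A real back end would persist this per competition period (weekly, monthly, annual) and per region, so each competition has its own ranking.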
  • FIG. 5 shows the format of the data 500 transmitted from the doll to the server. As can be seen the data comprises the information described above that is tracked by the interaction tracking engine.
  • Augmented Reality
  • As shown in FIG. 6, there is also provided means for generating an augmented reality. A camera 600, such as a webcam, connected to the user's PC 602 is used to obtain a real-time video stream of a play area. The augmented reality is initiated when a user introduces an object 604, with an imprinted code 606, into the play area 609. The functionality may be provided through the website 404, or via a stand-alone application on the user's PC 602.
  • By providing the ability to recognise codes or the like using the camera connected to the user's PC, the online website/application provides additional functionality to the user. The website/application recognises the code using the code recognition engine 610 (the code may or may not be visible to the user) and then interprets that code into an online image or text using the avatar generation engine; for example, the code could represent the doll that the code is imprinted on, and an avatar 608 representing that doll is then generated and shown on the user's PC screen 612. The avatar and the real doll can then communicate, via the USB communications dongle described above, as if the avatar were a real doll. Hence, the real doll and the virtual doll, in the form of an avatar, can both appear on the user's PC screen.
  • Alternatively, in the absence of a camera 600, the code can be inputted into the PC manually. In a similar way to that described above, the code recognition engine then interprets the code and the avatar generation engine generates an avatar 608. In this case only the avatar is shown on the PC screen, but the physical doll and the avatar can still interact via the communications dongle.
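Whether the code arrives via the camera or manual entry, resolving it to an avatar amounts to a lookup; the code table and avatar fields below are invented purely for illustration:

```python
# hypothetical table mapping imprinted codes to avatar descriptions;
# in practice this would live on the server behind the website
CODE_TO_AVATAR = {
    "D001": {"doll_type": "Doll A", "image": "doll_a.png"},
    "D002": {"doll_type": "Doll B", "image": "doll_b.png"},
}

def generate_avatar(code):
    """Resolve a recognised (or manually entered) code to an avatar
    description for rendering on the PC screen."""
    try:
        return CODE_TO_AVATAR[code]
    except KeyError:
        raise ValueError(f"unrecognised code: {code}")
```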
  • The user is required to log on to the website before the augmented reality can be generated, and so the user's details are known, i.e. what dolls they own; this is accomplished using both the user and the doll IDs. Therefore, when the interaction is generated it is known what theme the physical doll currently has within memory. Thus the correct theme can be used by the avatar. This can also be accomplished by identifying the doll, connected wirelessly to the PC via the communications dongle, using the doll's unique identifier, and so a user's friend's doll can also interact with an avatar at the same time as the user's doll.
  • Further, the means for providing augmented reality can recognise a plurality of objects, and produce a plurality of avatars on the user's PC screen. These avatars then communicate with each other, and/or a physical doll.
  • Alternatively, the code 606 is used to provide a reference to a special theme that is then automatically downloaded to the doll, and then the doll and the avatar communicate within that theme.
  • In addition, there is provided object recognition software that enables the website/application to recognise certain objects, such as items of specific clothing, toy sports equipment, etc. By introducing the object, the course of the interaction may be altered, for example, the weightings of the next phrases can be altered such that it is more likely that the dolls will communicate about the object that has just been introduced. For example, if a toy jumper is introduced, the weighting for communicating about clothes shopping is increased by an order of magnitude. In this way, the user can influence the interaction between the dolls.
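The order-of-magnitude weighting boost for phrases about an introduced object can be sketched as a simple transform over the phrase weightings (the topic-tagging scheme and names are assumptions for illustration):

```python
def boost_topic_weights(weights, topics_by_phrase, topic, factor=10):
    """Return a copy of the phrase weightings with every phrase tagged
    with `topic` boosted by `factor` (an order of magnitude by default),
    making the dolls far more likely to talk about the introduced object.
    """
    return {
        phrase: (w * factor if topic in topics_by_phrase.get(phrase, ()) else w)
        for phrase, w in weights.items()
    }
```

For instance, introducing a toy jumper would boost every phrase tagged "clothes", steering the conversation toward clothes shopping without forcing it.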
  • In an alternative, the augmented reality system can recognise a code for, for example, a pregnant doll, and will then spawn an avatar in the form of a baby. This is expanded such that the code can represent any virtual object. For example, the code could represent a new pair of shoes that are automatically shown on the real doll's feet on the user's PC screen. This is put into effect using the object recognition software described above, since the doll can be recognised, and hence the image of the shoes can be placed correctly on the screen to make it appear as though the shoes are actually on the doll.
  • When the user introduces the object, a card or the like, into the play area points are given to the user's account. Depending on the type of object a different number of points are given. The cards or the like can be obtained from the website itself, or from retail shops.
  • In summary, each doll has a unique identifier (tag or number), and every unit is identifiable as a unit. The user logs on to the website, connects the doll to the PC (the doll being recognised using the unique identifier), and the user can then input personal details about the doll; i.e. name, favourite colour, favourite pets, favourite music, etc. The user is then able to acquire points on his/her account, associated with the doll, by way of:
      • connecting the doll to the website (i.e. every time you connect you get a single point);
      • based on the statistical parameters discussed above;
      • collecting "Golden Phrases"; and
      • playing games.
  • By accumulating points through playing with the physical doll the user competes with other users, and weekly, monthly, and annual winners are announced who receive prizes.
  • It is of course to be understood that the invention is not intended to be restricted to the details of the above embodiments which are described by way of example only, and modifications of detail can be made within the scope of the invention.
  • Each feature disclosed in the description, and (where appropriate) the claims and drawings may be provided independently or in any appropriate combination.

Claims (44)

What is claimed is:
1. A toy comprising:
a processor for generating interactions between such toy and at least one other toy capable of interacting with such toy;
an interaction tracking engine for generating data related to the interactions, wherein said data includes a measure of the interactions, such as a count related to said interactions and/or a temporal measure; and
a memory connected to said interaction tracking engine for storing said data.
2. The toy according to claim 1 further comprising means for outputting said data, preferably to a server.
3. The toy according to claim 1, wherein said memory is adapted to store said data relating to a plurality of interactions.
4. The toy according to claim 1, wherein the type of said data is predetermined.
5. The toy according to claim 1, wherein said data is statistical data.
6. The toy according to claim 5, wherein said measure includes a count related to said interactions.
7. The toy according to claim 5, wherein said measure is a temporal measure.
8. The toy according to claim 5, wherein said measure includes at least one of the following: the total number of separate interactions, the total number of separate interactions between the toy and each other specific toy, the total number of incidences of the toy using a particular interaction, the time of day of the interaction, and the total time the toy has participated in interactions.
9. The toy according to claim 1, wherein said data includes whether a predetermined interaction, such as a specific phrase and/or word, has been used during an interaction.
10. The toy according to claim 1, wherein said interaction is an audible interaction.
11. The toy according to claim 1, wherein said interaction is a physical interaction.
12. The toy according to claim 1, further comprising means for analyzing said data, wherein said analyzing means determines when a predetermined target value, associated with said data, has been reached.
13. The toy according to claim 12, wherein said predetermined target value is a count related to said data.
14. The toy according to claim 12, wherein said predetermined target value is a duration.
15. The toy according to claim 12, wherein said predetermined target value includes at least one of: the total number of separate interactions, the total number of separate interactions between the toy and each other specific toy, the total number of incidences of the toy using a particular interaction, the time of day of the interaction and the total time the toy has participated in interactions.
16. The toy according to claim 12, further comprising means for outputting said analysis.
17. The toy according to claim 16, wherein said analysis outputting means incorporates a unique identifier, associated with said toy, with said analysis.
18. The toy according to claim 2, wherein said data outputting means incorporates a unique identifier, associated with said toy, with said data.
19. The toy according to claim 1, wherein said toy is a computer, and the form of said toy is represented by an avatar on said computer's screen.
20. The toy according to claim 1, wherein said toy is an individual object.
21. A server comprising:
means for communicating with a plurality of toys;
means for receiving data related to each said toy;
means for processing said data; and
means for allocating points to each said toy in dependence on said processed data.
22. The server according to claim 21, wherein said points are stored in memory associated with each respective toy.
23. The server according to claim 21, further comprising means for comparing the points associated with each toy.
24. The server according to claim 23, wherein said comparison means is adapted to generate a ranked list of toys according to the number of points associated with each respective toy.
25. The server according to claim 24, wherein said processing means determines when a predetermined target value, associated with said data, has been reached.
26. The server according to claim 25, wherein said predetermined target value is a count related to said data.
27. The server according to claim 25, wherein said predetermined target value is a duration.
28. The server according to claim 21, wherein each said toy is a toy comprising:
a processor for generating interactions between such toy and at least one other toy capable of interacting with such toy;
an interaction tracking engine for generating data related to the interactions; and
a memory connected to said interaction tracking engine for storing said data.
29. The server according to claim 28, wherein said predetermined target value includes at least one of: the total number of separate interactions, the total number of separate interactions between the toy and each other specific toy, the total number of incidences of the toy using a particular interaction, the time of day of the interaction and the total time the toy has participated in interactions.
30. The toy according to claim 1, further comprising a server including:
means for communicating with a plurality of toys;
means for receiving data related to each said toy;
means for processing said data; and
means for allocating points to each said toy in dependence on said processed data.
31. An augmented reality system, including:
a processor;
means for receiving a code;
an avatar generation engine adapted to generate an avatar in dependence on said code;
means for outputting data representing an image comprising the avatar generated by the avatar generation engine; and
means for communicating with a physical toy.
32. The system according to claim 31, wherein said code receiving means is adapted to receive a code via a manual input.
33. The system according to claim 31, wherein said code receiving means is adapted to receive a code via a camera.
34. The system according to claim 31, wherein said code is on a toy.
35. The system according to claim 34, wherein said toy is a doll or a card.
36. The system according to claim 31, wherein said communication means includes a wireless adapter.
37. The system according to claim 36, further comprising means for identifying said toy, and a theme stored within said toy, wherein said avatar and said toy then communicate within said theme.
38. The system according to claim 37, wherein said communication includes speech and actions.
39. An augmented reality system as claimed in claim 31, further comprising:
means for receiving image data, said image data representing an image of a physical toy in a physical environment, wherein said outputting means is adapted to output data representing said image comprising the generated avatar together with the image of the physical toy in the physical environment; and
means for receiving activity data representing an action of the physical toy;
wherein the processor is adapted to analyze said received activity data and to generate an action for performance by the avatar in response to the received activity data, whereby said avatar and said physical toy appear to interact in said physical environment.
40. An augmented reality system, comprising:
a processor;
an avatar generation engine adapted to generate an avatar;
means for outputting data representing an image comprising the generated avatar together with the image of the physical toy in the physical environment; and
means for receiving activity data representing an action of the physical toy;
wherein the processor is adapted to analyze said received activity data and to generate an action for performance by the avatar in response to the received activity data, whereby said avatar and said physical toy appear to interact in said physical environment.
41. The augmented reality system as claimed in claim 40, wherein said activity data represents speech.
42. The augmented reality system as claimed in claim 40, wherein said activity data represents a physical action.
43. (canceled)
44. (canceled)
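As an illustration only, the "interaction tracking engine" recited in claims 1 to 15 might be sketched as follows. All names are hypothetical; the claims define structure (a measure of interactions, such as a count and/or a temporal measure, and a predetermined target value), not an implementation.

```python
# Illustrative sketch of the interaction tracking engine of claims 1-15
# (hypothetical names; the claims do not prescribe an implementation).
from collections import Counter


class InteractionTracker:
    """Generates and stores data related to a toy's interactions."""

    def __init__(self, toy_id: str):
        self.toy_id = toy_id
        self.count_by_peer = Counter()  # interactions per specific other toy
        self.total_count = 0            # total number of separate interactions
        self.total_time = 0.0           # total time spent in interactions

    def record(self, peer_id: str, duration: float):
        # Store both a count-based and a temporal measure of the interaction.
        self.total_count += 1
        self.count_by_peer[peer_id] += 1
        self.total_time += duration

    def target_reached(self, target_count: int) -> bool:
        # Per claim 12: determine when a predetermined target value,
        # associated with the stored data, has been reached.
        return self.total_count >= target_count
```

The stored measures could then be output to a server, together with the toy's unique identifier, for point allocation and ranking as recited in claims 21 to 29.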
US13/639,411 2010-04-06 2011-04-06 Interacting toys Abandoned US20130244539A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
GB1005718.0 2010-04-06
GBGB1005718.0A GB201005718D0 (en) 2010-04-06 2010-04-06 Interacting toys
PCT/GB2011/050684 WO2011124916A1 (en) 2010-04-06 2011-04-06 Interacting toys

Publications (1)

Publication Number Publication Date
US20130244539A1 true US20130244539A1 (en) 2013-09-19

Family

ID=42228919

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/639,411 Abandoned US20130244539A1 (en) 2010-04-06 2011-04-06 Interacting toys

Country Status (6)

Country Link
US (1) US20130244539A1 (en)
EP (1) EP2555840A1 (en)
JP (1) JP5945266B2 (en)
CN (1) CN103201000A (en)
GB (1) GB201005718D0 (en)
WO (1) WO2011124916A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130078886A1 (en) * 2011-09-28 2013-03-28 Helena Wisniewski Interactive Toy with Object Recognition

Families Citing this family (8)

Publication number Priority date Publication date Assignee Title
GB0714148D0 (en) 2007-07-19 2007-08-29 Lipman Steven interacting toys
GB2507073B (en) 2012-10-17 2017-02-01 China Ind Ltd Interactive toy
CN105278477A (en) * 2014-06-19 2016-01-27 摩豆科技有限公司 Method and device for operating interactive doll
TWI559966B (en) * 2014-11-04 2016-12-01 Mooredoll Inc Method and device of community interaction with toy as the center
GB2550911B (en) * 2016-05-27 2021-02-10 Swap Bots Ltd Augmented reality toy
CN106552421A (en) * 2016-12-12 2017-04-05 天津知音网络科技有限公司 AR o systems
JP7331349B2 (en) * 2018-02-13 2023-08-23 カシオ計算機株式会社 Conversation output system, server, conversation output method and program
CN114053732B (en) * 2022-01-14 2022-04-12 北京优艾互动科技有限公司 Doll linkage method and system based on data processing

Citations (3)

Publication number Priority date Publication date Assignee Title
US20030003839A1 (en) * 2001-06-19 2003-01-02 Winbond Electronic Corp., Intercommunicating toy
US20050059483A1 (en) * 2003-07-02 2005-03-17 Borge Michael D. Interactive action figures for gaming schemes
US20080153594A1 (en) * 2005-10-21 2008-06-26 Zheng Yu Brian Interactive Toy System and Methods

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
JP4368962B2 (en) * 1999-02-03 2009-11-18 株式会社カプコン Electronic toys
CN1161700C (en) * 1999-04-30 2004-08-11 索尼公司 Electronic pet system, network system, robot and storage medium
JP2001318872A (en) * 2000-05-10 2001-11-16 Nec Corp Communication system and communication method
JP2003039361A (en) * 2001-07-24 2003-02-13 Namco Ltd Information providing system, robot, program, and information storage medium
JP2005211232A (en) * 2004-01-28 2005-08-11 Victor Co Of Japan Ltd Communication support apparatus
GB2425490A (en) * 2005-04-26 2006-11-01 Steven Lipman Wireless communication toy
JP2009000472A (en) * 2007-06-22 2009-01-08 Wise Media Technology Inc Radio tag and growth toy device by network communication
GB0714148D0 (en) * 2007-07-19 2007-08-29 Lipman Steven interacting toys
JP4677593B2 (en) * 2007-08-29 2011-04-27 株式会社国際電気通信基礎技術研究所 Communication robot
US8545335B2 (en) * 2007-09-14 2013-10-01 Tool, Inc. Toy with memory and USB ports
RU2011116297A (en) * 2008-10-06 2012-11-20 Вердженс Энтертейнмент ЭлЭлСи (US) SYSTEM FOR MUSICAL INTERACTION OF AVATARS



Also Published As

Publication number Publication date
GB201005718D0 (en) 2010-05-19
WO2011124916A1 (en) 2011-10-13
EP2555840A1 (en) 2013-02-13
CN103201000A (en) 2013-07-10
WO2011124916A4 (en) 2011-12-08
JP5945266B2 (en) 2016-07-05
JP2013523304A (en) 2013-06-17

Similar Documents

Publication Publication Date Title
US20130244539A1 (en) Interacting toys
US8332544B1 (en) Systems, methods, and devices for assisting play
US10229608B2 (en) Wireless communication between physical figures to evidence real-world activity and facilitate development in real and virtual spaces
WO2018145527A1 (en) Cross-platform interaction method and device, program, and medium
WO2015063610A2 (en) Computer systems and computer-implemented methods for conducting and playing personalized games based on vocal and non-vocal game entries
WO2019134462A1 (en) Implementation method and apparatus for multi-people somatosensory dance, and electronic device and storage medium
CN111274151B (en) Game testing method, related device and storage medium
JP2013523304A5 (en)
WO2019124059A1 (en) Game device, game method and recording medium
CN111936213A (en) Generating Meta-Game resources with social engagement
US20210205715A1 (en) Contextual ads for esports fans
JP7194509B2 (en) Server system, game system and matching processing method
US20230001300A1 (en) Computer system, game system, and control method of computer system
WO2022003938A1 (en) Information processing system, information processing method, computer program, and device
CA3051053C (en) Physical element linked computer gaming methods and systems
Park et al. The interplay between real money trade and narrative structure in massively multiplayer online role-playing games
Bischke et al. Biofeedback implementation in a video game environment
Machado Application development over IoT platform Thingworx
KR101781179B1 (en) Method for game service and apparatus executing the method
KR101285743B1 (en) Method and server for controlling game simulation
KR20130114314A (en) Method and apparatus for indicating engagement information of each user
WO2023276965A1 (en) System, method, and program for supporting betting in competition
Pnevmatikakis et al. Game and multisensory driven ecosystem to an active lifestyle
Eklund et al. Geofort: Mobile game for motivating physical activity using gamification and augmented reality
KR20230155965A (en) System for processing information using collective intelligence and method thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: LIBRAE LIMITED, ISLE OF MAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LIPMAN, STEVEN;REEL/FRAME:030499/0049

Effective date: 20130520

AS Assignment

Owner name: LIPMAN, STEVEN, UNITED KINGDOM

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LIBRAE LIMITED;REEL/FRAME:041658/0045

Effective date: 20170306

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE