US20060226298A1 - Graphical method and system for model vehicle and accessory control - Google Patents
- Publication number
- US20060226298A1 (Application No. US 11/096,841)
- Authority
- US
- United States
- Prior art keywords
- model
- graphical
- control
- display screen
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H30/00—Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
- A63H30/02—Electrical arrangements
- A63H30/04—Electrical arrangements using wireless transmission
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H19/00—Model railways
- A63H19/24—Electric toy railways; Systems therefor
Definitions
- the present invention relates to a method and system for remote control of a model vehicle, such as a model train.
- a control unit having a user interface.
- the user interface may include one or more keys, buttons, levers, knobs or the like, for obtaining user input used in control of the model vehicle.
- a control unit for a model train may include a dial that can be turned for speed control, and numerous buttons for functions such as on/off, train sounds, smoke generators, lights, model track switches and accessories.
- Control units may also include a display screen for display of text messages, and one or more lights, sound generators, or other system state indicators.
- Control units for model trains often comprise small hand-held devices, for the convenience of the user. Stationary control interfaces are also known.
- Remote control interfaces for model trains may communicate with model trains and accessories using various methods.
- One approach is to provide control signals from a remote control unit via a wired or wireless connection to a base station, which is connected to a model railway track.
- the base station may translate received control signals into one or more suitable command protocols for a targeted model train or accessory. For example, it is known to send DC offset signals superimposed with AC track power through an electrified rail of the model railway. Model trains or accessories receiving power through the rail may then receive the DC offset signals, decode them using an onboard controller, and respond appropriately.
- Another approach is to transmit a radio-frequency (RF) signal through a rail of the model track, essentially using the model track as a near-field antenna for an RF signal that fades rapidly with distance from the track.
- the signal used is a 455 kHz frequency-shift keyed (FSK) signal at 5 volts peak-to-peak. This signal creates a field along the length of the track, detectable within a few inches of the track.
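The FSK signaling described above can be sketched in software. In the following illustration, the mark/space frequency offsets, bit rate, and sample rate are assumptions chosen for demonstration only; the specification gives only the 455 kHz center frequency and 5-volt peak-to-peak amplitude.

```python
import math

# Sketch of FSK modulation: a carrier near 455 kHz whose frequency is
# shifted to encode bits. The mark/space offsets, bit rate, and sample
# rate below are illustrative assumptions, not values from the patent.
SAMPLE_RATE = 10_000_000      # 10 MS/s, comfortably above 455 kHz
BIT_RATE = 10_000             # hypothetical command bit rate
F_MARK = 455_000 + 5_000      # frequency representing a '1' bit
F_SPACE = 455_000 - 5_000     # frequency representing a '0' bit
AMPLITUDE = 2.5               # volts, giving 5 V peak-to-peak

def fsk_waveform(bits):
    """Return phase-continuous waveform samples for a bit sequence."""
    samples = []
    phase = 0.0
    samples_per_bit = SAMPLE_RATE // BIT_RATE
    for bit in bits:
        freq = F_MARK if bit else F_SPACE
        step = 2 * math.pi * freq / SAMPLE_RATE
        for _ in range(samples_per_bit):
            samples.append(AMPLITUDE * math.sin(phase))
            phase += step
    return samples

wave = fsk_waveform([1, 0, 1, 1])
```

Keeping the phase continuous across bit boundaries, as above, avoids spectral splatter when the frequency shifts.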
- Trains or accessories are equipped with a receiver for the RF signal, and a microprocessor that implements commands addressed to a particular train car or track accessory.
- Lionel manufactures a system of remote controllers and compatible modular receiver and control units, referred to as “CAB-1” remote controllers and modules, respectively.
- CAB-1 modules may be added to Lionel trains and accessories via a standard connector socket, to enable control of a train or accessory using FSK signaling and a CAB-1 remote controller.
- an RF or infrared (IR) signal may be broadcast directly to a train or accessory through the air from a base station or remote control unit, received by a receiver on the train or accessory, and processed by a control unit.
- the present invention provides a control unit and interface that may be used to control multiple model vehicles and accessories via a single interface, without requiring the user to enter any keystroke commands or to memorize control sequences.
- the control unit may be used with multiple different control methods, including but not limited to DC offset control, RF control (including but not limited to FSK control), and IR control.
- the control unit may be used to control model vehicles and accessories using different control systems. Accordingly, a control unit according to the invention may be used by a hobbyist to control a variety of different types of trains and accessories, or other model vehicles, using a single interface.
- the control unit comprises a graphical display screen, a processor executing graphical user interface (GUI) software operative to display graphical images on the display screen, a pointing device operative to provide user input to the processor, and a model vehicle control output.
- a pointing device may comprise, for example, a touchscreen responsive to finger or stylus touches, a mouse, trackball, touchpad, joystick button, and so forth.
- the control unit may be implemented using a special-purpose computer, such as a hand-held (or desktop) control unit with an integrated controller, display screen, pointing device, and control output.
- the control unit may be implemented using standard displays, pointing devices, and outputs of a general purpose computer, using the computer's microprocessor to execute suitably configured control software.
- One of ordinary skill may implement the novel user interface and control system as disclosed herein in other ways, as well.
- the invention comprises software executed on a processor of the control unit.
- the software is operative to cause graphical displays to appear on the display screen, and generates appropriate commands for controlling trains and accessories in response to input from the pointing device.
- the software may cause an image or graphical representation of a model train to be controlled to appear on the display screen.
- a user may perform actions such as touching a wheel, touching a light, smokestack, tender, or other part of the train, clicking and dragging the train forwards or backwards, and so forth.
- the controller receives the pointing device input which is interpreted by the software, such as by using a lookup table or object-oriented program structure, as correlating to specific desired commands.
- the software may then cause an appropriate command to be transmitted to the model train or accessory using a selected transmission path.
- the software may be configured such that actions of the pointing device correlate intuitively to commands or to actions of trains or accessories, and the display correlates intuitively to the unit under control. For example, clicking and dragging an image of a train forwards or backwards may cause the selected train to move in the indicated direction. A commanded velocity may correlate to a speed or direction of cursor movement, or to a stylus pressure. Selecting the image of a component may cause the corresponding component of the model train or accessory to operate. Selecting the component again may cause it to toggle off or on, depending on its last state. Images may be modified to indicate the state of the train or accessory. For example, images of a turning wheel may indicate motion of a train. Similarly, the state of smoke emission devices, lights, switches, and other accessories may be indicated using an image or icon.
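The lookup-table correlation between pointer actions and commands described above might be sketched as follows. The region names, gesture names, and command identifiers here are hypothetical illustrations; the patent does not specify any particular correlation scheme.

```python
# Sketch of a lookup table mapping pointer events on active display
# regions to model-vehicle commands, with a second selection toggling
# a component off or on depending on its last state. All region and
# command names below are hypothetical.
class TrainDisplay:
    COMMANDS = {
        ("headlamp", "tap"): ("HEADLAMP", "toggle"),
        ("smokestack", "tap"): ("SMOKE", "toggle"),
        ("wheels", "drag_forward"): ("MOTION", "forward"),
        ("wheels", "drag_backward"): ("MOTION", "reverse"),
    }

    def __init__(self):
        self.state = {"HEADLAMP": False, "SMOKE": False}
        self.sent = []  # commands handed to the transmitter

    def pointer_event(self, region, action):
        entry = self.COMMANDS.get((region, action))
        if entry is None:
            return  # pointer input outside any active region
        target, verb = entry
        if verb == "toggle":
            # toggle off or on depending on the component's last state
            self.state[target] = not self.state[target]
            verb = "on" if self.state[target] else "off"
        self.sent.append((target, verb))

display = TrainDisplay()
display.pointer_event("headlamp", "tap")        # headlamp on
display.pointer_event("headlamp", "tap")        # headlamp off again
display.pointer_event("wheels", "drag_forward")
```

An object-oriented structure like this, or a flat lookup table, would serve equally well; the essential point is that the pair (display region, gesture) resolves to a command.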
- a control unit may be configured to emulate any number of legacy controllers, for users who prefer using a legacy control interface.
- a display screen of a control unit may be provided with a piezoelectric layer disposed to respond to finger touches.
- the piezoelectric layer may be configured to vibrate and thereby provide auditory or tactile feedback in response to finger or stylus pressure.
- the feel or sound of actions such as button presses, slider or dial movements, or button taps may be simulated.
- Vehicle sounds, movement, and vibration may also be simulated using the piezoelectric layer.
- FIG. 1A is a block diagram showing an exemplary system according to the invention.
- FIG. 1B is a block diagram showing an exemplary handheld control unit in a system including legacy control units.
- FIG. 2 is a break-away perspective view showing a handheld control unit according to the invention.
- FIGS. 3-10 are diagrams showing various exemplary graphical displays for a control interface according to the invention.
- FIG. 11 is a flow chart showing exemplary steps of a method according to the invention.
- the present invention provides a method and system for model vehicle and accessory control that overcomes the limitations of the prior art.
- like element numerals will be used to indicate like elements appearing in one or more of the figures.
- FIG. 1A shows a system 101 for controlling a model vehicle or accessory, comprising a control unit 100 in communication with a model vehicle 120 , switch accessory 122 , and decorative accessory 124 .
- the control unit 100 may be configured as a handheld device.
- the control unit may be configured as a desktop or layout-mounted device. It should also be possible to implement the control unit using a general purpose computer, which may reduce the need for additional hardware or provide a larger display or more powerful processing engine than might otherwise be possible. It is therefore anticipated that some users may implement the control unit 100 in more than one way, to take advantage of the characteristics of different hardware platforms.
- Control unit 100 may comprise any suitable processor 102 as known in the art. Various standard semiconductor devices for digital processing, and circuits incorporating such devices, are known in the art.
- the processor 102 should be operatively associated with a memory 108 for holding data and program instructions to be executed by processor 102 .
- Processor 102 may incorporate various components such as bus controllers, graphic controllers, and other auxiliary components as known in the art. From the description herein, one of ordinary skill should be able to suitably configure processor 102 , including providing it with software or other programming instructions using any suitable computer language. Suitable languages may include, for example, C, C++, Visual Basic, Delphi, Python, Java, assembly language, machine language, or combinations of the foregoing with each other or with other languages. Libraries and routines for handling video displays and various input devices, including input from pointing devices, are generally available for these and other programming languages.
- the control unit further comprises a video display device 104 operatively associated with the processor 102 .
- video display devices include, for example, computer monitors configured to accept analog signals such as RGB composite, YPbPr component, or S-video, digital monitors and televisions configured to use DVI, HDMI, or other digital signal input, and televisions that use an NTSC, PAL, SECAM, or High Definition television signal as input.
- Digital monitors and televisions often include an adaptor for accepting various analog audio-video (A/V) signals, as well.
- the display screen itself may comprise a cathode ray tube, liquid-crystal flat panel (LCD), gas plasma panel, or similar device. In an embodiment of the invention, a handheld-size LCD flat panel may be used.
- the display device 104 may also comprise electronics for driving a display screen using one or more types of video input signals, and for providing audio output, as known in the art.
- a graphical pointing device 106 should also be communicatively connected to processor 102 , to provide an indication of position, movement, pressure, or other pointing characteristics.
- Pointing device 106 may comprise any suitable hardware or software for sensing position, motion, or touch pressure of a physical object. Such sensors may provide sensor input to a device driver, which provides an interface between the sensor and an application running on processor 102 .
- Pointing devices may provide an indication of two-dimensional position or movement of an object, for example, a mechanical ball, optical sensor, human fingertip, stylus tip, or eyeball. In some cases, three-dimensional information, touch pressure, and rotational movement information may be provided as well.
- touchpads and touchscreens comprise two-dimensional membranes that react to the touch of a finger or stylus tip and provide a signal indicative of position or movement.
- a touchscreen may comprise a transparent or translucent touchpad membrane disposed over a video display.
- Processor 102 should be configured to receive input from pointing device 106 , and generate commands for controlling model vehicle 120 , switch accessory 122 , or decorative accessory 124 of layout 114 in response to the pointing input.
- Software for the processor may be configured to provide commands according to separate command protocols, or a single command protocol.
- Various suitable command protocols for model vehicles and accessories are known in the art. For example, the Lionel TrainMaster™ Command Control system makes use of a defined command set and protocol, as do other methods for remote control of model vehicles.
- the novel method as disclosed herein for generating commands for a model vehicle based substantially on input from a pointing device may require developing new systems for correlating pointer actions in synchrony with a video display. For example, “grabbing” an image of a model vehicle wheel on a video display and “dragging” it forward may be correlated with a known command for “start forward motion,” and so forth.
- the invention is not limited to any particular set or system of correlations between pointer actions and vehicle or accessory commands.
- software for correlating pointer actions to commands or other output is known from other contexts in computer programming, and may be readily adapted by one of ordinary skill to practice the invention.
- Control unit 100 may further include a transmitter 110 for transmitting a wireless signal to model objects 120 , 122 , 124 or to command units 126 , 128 .
- the control unit may be hardwired to a remote interface module or transmitter, or may be directly connected to model objects or to the model track of layout 114 .
- the control unit may also comprise a receiver for receiving information from the model objects.
- a receiver may be located with transmitter 110 or in a separate component.
- Transmitter 110 may be configured to transmit and receive information with the model objects via one or more transmission paths. For example, information and commands may be transmitted by over-the-air broadcast to a receiver located on the model objects 120 , 122 , 124 .
- a suitable receiver may comprise, for example, a receiver and controller in a modular unit that may be connected via a removable connector to the power and operational devices of the model object.
- Lionel provides CAB-1 receiver/controller modules.
- the CAB-1 receiver is configured to receive FSK signals emitted from the model track, and is intended for use with TMCC-1 (TrainMaster™ Command Control) vehicles and accessories.
- CAB-1 modules may be adapted by one of ordinary skill to receive RF or IR over-the-air broadcast signals.
- a suitably adapted module for receiving RF transmissions may be referred to, for example, as a “CAB-2” module.
- CAB-2 modules may be provided with a common interface with CAB-1 modules.
- Model vehicles and accessories may be upgraded from TMCC-1 to TMCC-2 merely by replacing a CAB-1 receiver module with a CAB-2 module.
- transmitter 110 and other components of the system may be configured to support CAB-2 transmission standards while providing backward compatibility to CAB-1 or other standards. The hobbyist may thereby use control unit 100 to control model vehicles and accessories of different types on the same layout.
- control unit 100 may communicate with model objects via an FSK command interface unit 126 connected to the model layout.
- a command interface 128 may similarly be provided for any other command protocol, such as DC-offset signaling.
- Command interface units 126 , 128 may be configured to receive transmissions from the control unit 100 and to generate commands in any desired protocol for model objects in layout 114 capable of receiving and operating accordingly.
- Command units as known in the art for FSK signaling, DC-offset control, or other protocols may be adapted by one of ordinary skill to receive and process information from the control unit 100 .
- the control unit 100 may be configured to transmit information in a manner already compatible with prior-art command units.
- a single control unit 100 may control model objects using different protocols. Accordingly, a suitably configured control unit may be used to replace numerous otherwise incompatible prior-art control units for controlling any desired number of different types of model objects on layout 114 .
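The routing of one logical command through different protocols, as described above, might be sketched as follows. The encoder functions, wire formats, and object names are hypothetical illustrations only, not actual TMCC or DC-offset encodings.

```python
# Sketch of a single control unit dispatching commands to model objects
# that use different command protocols. Each object is registered with
# the encoder for its protocol; the string formats are placeholders.
def encode_fsk(address, command):
    # stand-in for building an FSK-protocol message
    return f"FSK:{address}:{command}"

def encode_dc_offset(address, command):
    # stand-in for building a DC-offset-protocol message
    return f"DC:{address}:{command}"

# each model object registered with the encoder for its protocol
OBJECTS = {
    "engine_1": encode_fsk,        # FSK-controlled vehicle
    "switch_3": encode_dc_offset,  # legacy DC-offset switch accessory
}

def send(obj_name, command):
    """Encode a logical command in whatever protocol the object uses."""
    encoder = OBJECTS[obj_name]
    return encoder(obj_name, command)
```

With a registry of this kind, one interface can drive otherwise incompatible objects on the same layout, which is the replacement scenario the passage above contemplates.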
- layout 114 may comprise separate control blocks 116 and 118 of model track, to which one or more legacy command units 128 may be separately connected. Power may be supplied through a conventional power supply 130 .
- Processor 102 may also be configured to communicate via a wireless router 140 or other network device through a network 142 to a remote server 144 .
- This connectivity may be used, for example, to update operating software in memory 108 as improvements are made, or as new models of model vehicles and accessories are released.
- the control unit may be configured to automatically check, download, or install software updates at periodic intervals, or in response to changes in system 101 . In the alternative, or in addition, a user may manually initiate any of these update processes.
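The version comparison underlying such an update check might be sketched as follows, assuming a simple dotted numeric version scheme (an assumption; the patent does not specify a version format or server interface).

```python
# Sketch of deciding whether a software update is needed: compare the
# version stored in control-unit memory against the server's latest.
# Comparing numeric tuples (not strings) handles versions like "1.10.0"
# correctly.
def needs_update(installed: str, latest: str) -> bool:
    def to_tuple(version: str):
        return tuple(int(part) for part in version.split("."))
    return to_tuple(latest) > to_tuple(installed)
```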
- Information may also be transmitted from the control unit 100 to the server 144 , for example, current version information or system state, or an acknowledgment that an update has been successfully installed may be transmitted.
- FIG. 1B shows an exemplary TMCC-2 handheld control unit 100 according to the invention, in a control system together with a TMCC-1 controller 150 .
- Controller 150 transmits control signals wirelessly to an FSK base 152 .
- Base 152 may be configured as a TMCC-1 base, which receives TMCC-1 command signals, translates or transforms the signals to an FSK-compatible format 160 , and broadcasts the signals from a rail of a layout 114 .
- Vehicle 120 , being equipped with a CAB-1 receiver (not shown), receives the FSK control signals via near field 166 , and operates accordingly.
- the TMCC-1 handheld control unit 150 also wirelessly transmits control signals to a TMCC-1 power control unit 154 , which modulates AC phase-controlled power 162 to layout 114 in response to the control signals, for speed control of the vehicle 120 .
- handheld control unit 150 may transmit control signals to a TMCC-2 power control unit 156 .
- Power control unit 156 may be backwardly compatible with the TMCC-1 command set, while providing additional TMCC-2 capabilities as described below for use with TMCC-2 handheld controller 100 .
- TMCC-2 controller 100 may be used instead of, or in addition to, controller 150 to control model vehicles in the same layout 114 .
- Controller 100 may be configured for backward compatibility with the TMCC-1 standard, and thus, may transmit TMCC-1 control signals to base 152 , which then may operate exactly as described in connection with controller 150 .
- controller 100 may transmit a TMCC-1 signal to power control unit 154 , for control of AC power in the same manner as used with TMCC-1 controller 150 .
- control unit 100 may wirelessly communicate directly with the so-equipped vehicle, as described in connection with the preceding FIG. 1A .
- Communication with the CAB-2 transmitter may be two-way, for example commands may be transmitted to vehicle 120 , and data or acknowledgements transmitted from the vehicle 120 to controller 100 .
- Control unit 100 may also be used to transmit and receive from a TMCC-2 power control unit 156 for control of AC phase-controlled power 164 to layout 114 .
- Power unit 156 may comprise a transmitter for transmitting data back to handheld unit 100 .
- data may include, for example, voltage, current, or phase information measured at one or more connection points to a power supply rail of track layout 114 .
- a CAB-2 equipped vehicle may transmit data regarding track power received by the vehicle to the power control unit 156 , which may relay it to unit 100 .
- power control unit 156 may include a processor configured to implement a feedback control loop for power control, such as a PI or PID control loop, to more accurately control power supplied to the track 114 or vehicle 120 . Power information may also be relayed or transmitted directly to remote unit 100 for diagnostic or control purposes.
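The feedback power-control loop mentioned above might be sketched as a discrete PI controller driving a simple simulated plant. The gains, time step, and first-order plant model are illustrative assumptions; an actual power control unit would regulate real track voltage measurements.

```python
# Sketch of a discrete PI control loop of the kind power control unit
# 156 might implement: measured track voltage is fed back and the drive
# level adjusted toward a setpoint. Gains and plant model are
# illustrative assumptions.
def run_pi_loop(setpoint, steps=200, kp=0.5, ki=1.0, dt=0.1):
    """Drive a simple first-order track-power model toward a setpoint."""
    drive = 0.0      # controller output (e.g., phase-control level)
    integral = 0.0   # accumulated error term
    measured = 0.0   # simulated track voltage fed back to the controller
    for _ in range(steps):
        error = setpoint - measured
        integral += error * dt
        drive = kp * error + ki * integral
        # stand-in plant: measured voltage lags toward the drive level
        measured += (drive - measured) * 0.5
    return measured

final = run_pi_loop(18.0)   # regulate toward 18 V track power
```

The integral term removes steady-state error, which is what distinguishes PI (or PID) regulation from simple proportional adjustment of track power.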
- While FIGS. 1A-B illustrate embodiments of control units employing multiple control protocols and capabilities, the invention is not limited to the specific protocols and control capabilities disclosed.
- An important aspect of the invention concerns the use of a graphical user interface (GUI) in a control system or method. This aspect of the invention may be practiced independently from, or in conjunction with, the multiple control protocols and capabilities exemplified above. Further details pertaining to use of a graphical user interface for control are described in the specification below.
- FIG. 2 shows an exemplary physical package for a handheld graphical control unit 200 .
- Control unit 200 comprises a prominent video display screen 203 for display of graphic images.
- the control unit comprises a substantially keyless interface, relying on the display screen and a pointing device as the primary control interface.
- the display screen may comprise a flat panel LCD 206 as known in the art, or other suitable display.
- a substantially transparent or translucent touchscreen membrane sensor element 204 may overlay the LCD 206 and be configured as a pointing device, as known in the art.
- Suitable touchscreen membranes may comprise, for example, capacitive touchscreens or resistive touchscreens, or more advanced technologies such as IR scanning or surface acoustic wave (SAW) touchscreens.
- the membrane element may be connected to a processor (e.g., processor 102 shown in FIG. 1A ) via a universal serial bus (USB), serial, or other suitable connection via a suitable interface device, as known in the art.
- the control unit 200 may also comprise one or more electronics boards 208 for holding system electronics, such as processors, transmitter/receivers, display and pointer interfaces, memory devices, batteries, and so forth.
- the arrangement of internal components may be accomplished in any suitable manner by one of ordinary skill. While the depicted embodiment suggests a general anticipated arrangement of elements, one of ordinary skill should understand that it is not intended to depict actual details of a mechanical layout, which may be determined during a design process.
- Display 203 may be supported and protected by a sturdy frame 202 , comprising any suitable structural material, such as plastic or metal. It may be assembled using various fasteners or adhesives as known in the art. An outer periphery of the frame 202 may project beyond the display surface 203 .
- the frame may be contoured or provided with a rubberized outer layer for comfortable holding.
- the control unit may be provided with a removable case (not shown), as generally known in the art for handheld appliances.
- An exterior of the frame 202 may be provided with various interface features, for example, a retractable antenna 210 connected to a transmitter/receiver of the control unit.
- One or more interface ports 212 A, 212 B may be provided, for communicating with external devices.
- Various suitable port configurations are known in the art, for example, USB, serial, IEEE 1394, IEEE 802 (wireless), and so forth.
- a power port 214 may be provided for supplying power to system electronics.
- the control unit 200 may be used in a wireless mode to control model vehicles and accessories. Wired ports 212 A-B and 214 may be used during system off times, such as while recharging system batteries, reconfiguring system software, or configuring other system components. It may also be advantageous to supply the control unit 200 with a physical on/off switch as known in the art (not shown).
- the frame may be curved, contoured, include one or more hand grips or finger grips, or may comprise any other suitable shape.
- a different pointing device may be used, such as a joystick button or touchpad.
- the user interface may comprise a substantially keyless interface as shown, or in the alternative, one or more keys may be included.
- the display 203 is not limited to the rectangular shape depicted, and may be provided in any suitable shape.
- the control unit need not be provided as a handheld device, and may even be implemented using a general-purpose computer. However, it is desirable that a video display screen of some type comprise a prominent feature of the control unit, to facilitate a graphical pointing-based control method as described herein.
- the display screen of the control unit may be used to display a series of graphic images and icons.
- a user indicates desired control actions by interacting with the display using a graphical pointing device.
- the control unit determines the intended control action from the pointer input and state of the graphic display, and transmits an appropriate command or message to one or more selected model vehicles, accessories, or command interface units.
- FIGS. 3-10 provide examples of display screens for providing a user with a substantially keyless control interface.
- One of ordinary skill may readily program such graphic displays using various suitable computer languages. These figures merely exemplify the conceptual framework of the invention, and the invention is not limited to the particular images or arrangement of images described herein.
- One of ordinary skill may devise a virtually unlimited variety of images, icons, and arrangements for use with the invention. Indeed, it is anticipated that particular control units may be frequently updated with software providing new and different interactive graphic interfaces for the enjoyment of the model vehicle hobbyist.
- FIGS. 3-10 provide examples of generally text-free control interfaces.
- One advantage of a text free interface is that text need not be translated into different languages for use by different language speakers.
- graphical user interfaces may be made easier to learn or operate by the use of accompanying text messages or labels.
- the near absence of text in FIGS. 3-10 should not be taken to suggest that use of text is undesirable.
- Text may be used, for example in pop-up windows or labels, to facilitate understanding and navigation of the graphical control interface.
- While FIGS. 3-10 show displays for a display screen 203 of a control unit 200 , the invention is not limited thereby. It is anticipated that displays may be provided in other shapes and proportions, and the concepts of the invention may readily be adapted to various different displays. For example, larger, more complex displays may be implemented on larger screens, such as on display monitors for general-purpose computers.
- FIG. 3 shows an exemplary display such as may be used for a top-level or upper level screen. That is, a user may be presented with a display as shown in FIG. 3 on a display 203 after turning the control unit on or first starting the control application, as a “home” screen. From this display, the user may be presented with options for opening various different lower-level screens. For example, by selecting icon 302 using a pointer action, a user may indicate a desire to select one or more model vehicles for control. In response, the control unit may generate a display containing one or more images of model vehicles, or graphics representing model vehicles, as shown in FIG. 4 . By selecting an image or icon representing a model vehicle, the user may indicate a desire to control that vehicle. One model vehicle is shown in FIG. 4 , but it should be appreciated that multiple vehicles may be shown on a single screen.
- a user may indicate a desire to control switch accessories of a model vehicle layout.
- By selecting icon 306 , a user may indicate a desire to control other components of a layout, such as accessories.
- Icon 308 may be used to open screens for configuring the interface, updating information regarding model vehicles and accessories controlled by the system, setting user preferences for system operation, and so forth. Button icons for initiating traditional navigation functions for graphical user interfaces may also be provided. For example, a “back” button 310 or a “forward” button 312 may be provided, for recalling previously displayed screens as known in the art.
- Icon 314 may be used to indicate a desire to place the control unit in an emulation mode, calling for a display emulating the function of a prior-art control unit, for example, a Lionel TrainMaster™ control unit.
- a cursor 305 may be used to indicate a current position of the pointing device. In the alternative, the cursor may be omitted.
- Other ways of indicating cursor actions or menu status may be used, for example, “graying” inactive icons, highlighting selected icons, providing pop-up windows in response to pointer input, and so forth.
- FIG. 4 shows an exemplary display for controlling a model vehicle, such as a model train, on a display screen 203 of a control unit 200 .
- the display includes an image of a model vehicle 321 .
- Two cars, a locomotive 316 and a tender 318 , are shown. Any number of connected cars of various types may also be displayed together.
- Certain regions of the display may be configured as active regions responsive to pointer input, as known in the art. For example, by clicking or otherwise indicating region 320 , the headlamp may be toggled on or off. Region 322 may be used to toggle on or off a smoke generator, or to call up a screen for more complex control options. Icon 324 or a region of the image may be used to turn a sound generator off or on. Different icons may be used to indicate different sounds. For example, a bell icon 326 may be used to sound the engine's bell. In the alternative, or in addition, the type of sound may be determined by the location of a sound icon or image region. For example, icon 324 may be used to trigger a horn sound; icon 332 may trigger engineer sounds, and icon 334 may trigger coal loading sounds.
- wheels 328 may indicate an active region for controlling movement of the model vehicle. For example, forward motion may be initiated by clicking and dragging the wheels forward. The action may be repeated to increase velocity. Clicking and dragging backward may slow the train, or send it in a reverse direction, depending on its initial velocity. In the alternative, velocity may be indicated by a rate or direction at which the pointer is moved. For example, moving the pointer up may cause an increase in engine power, while moving it down may cause a decrease. Any other suitable pointer action may also be interpreted to control vehicle movement, as desired.
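The drag-rate-to-velocity mapping described above might be sketched as follows. The pixel-to-speed gain and the clamping range are hypothetical; the range loosely assumes a 32-step speed scale, which is an assumption rather than a value from the specification.

```python
# Sketch of mapping a pointer drag across the wheel region to a speed
# step: the direction and rate of pointer movement determine how much
# the commanded speed changes. Gain and step range are assumptions.
def drag_to_speed_step(dx_pixels, dt_seconds, current_step=0,
                       gain=0.05, max_step=31):
    # positive dx (forward drag) raises the speed step; negative lowers
    # it, slowing the train or sending it into reverse
    rate = dx_pixels / dt_seconds          # pixels per second
    new_step = current_step + round(gain * rate)
    return max(-max_step, min(max_step, new_step))

# a quick forward drag of 120 px in 0.3 s while running at step 4
step = drag_to_speed_step(dx_pixels=120, dt_seconds=0.3, current_step=4)
```

Because the change scales with drag rate, a slow drag nudges the speed while a flick changes it sharply, matching the intuitive correlation the passage describes.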
- The display may be provided with any desired navigation icon buttons, for example, back button 310 and forward button 312 , as previously described.
- A home button 338 may also be provided, giving a direct link to an opening or "home" screen.
- Other navigation icon buttons may include shortcut buttons 340 , 342 , and 344 .
- Button 340 may provide a shortcut to a vehicle selection page, which may permit selection of a different system vehicle for control.
- Button 342 may provide a link to a page for selecting switch accessories for control, and button 344 may similarly provide for selection of other accessories.
- The control unit 200 may be configured to provide a current indication of the state of the model object, or of the last command sent to the model object.
- The control unit may determine a current state from information received from a model object, from the last command sent to the model object, or from some combination of the foregoing.
- The display may then be altered to reflect a current object or command state.
- FIG. 5 shows indicators 346 and 348 such as may be added to an engine 316 display to indicate an operational state of a headlamp or a smoke generating unit, respectively.
- Indicator 346 may suggest a beam of light emanating from a lamp, and may be present when the lamp indicator is in an ON state.
- Indicator 348 may suggest puffs of smoke emanating from a smokestack of engine 316 when the model vehicle smoke unit is turned on. A great variety of other indicators may also be provided, as desired.
- Features of the display may be animated to indicate a state of the model vehicle under control.
- Wheels 328 may be animated so as to appear to turn at a rate proportional to engine power.
- Couplers may be animated to indicate a coupling or decoupling action.
- Animated figures, such as train engineers, may be made to perform actions that correlate to sounds being generated and commanded vehicle actions.
- Animation effects may be added merely for decorative effect, independently of vehicle or command state, if desired.
- The display may be provided with icons or active regions for linking to related displays providing access to richer, more complex features of the model vehicle and system software.
- An active region 330 may indicate a control panel designed to suggest controls or gauges for the modeled locomotive. After selecting this region using a defined pointing action, the display may be changed to provide an enlarged view of the modeled control panel 350 , shown in FIG. 6 .
- Display 350 may comprise numerous gauges which may change depending on the actual engine state, the command state of the software, or some combination of the foregoing. Based on such inputs, the control unit may determine simulated engine values designed to model engine conditions that may occur in an actual engine.
- gauge 352 may indicate train speed, gauge 354 steam pressure, gauge 356 furnace temperature, gauge 358 engine oil pressure, and gauge 360 brake pressure.
- The control panel may also be provided with various command icons or active regions responsive to pointer input, emulating actual controls available on the modeled locomotive, such as steam or fuel valves, brake levers, throttles, etc.
- One of many such possible controls is indicated diagrammatically at 362 .
- The control unit may therefore serve as a vehicle simulator, with the model vehicle responding to user input via a control panel 350 .
- The control unit may thus provide multiple different ways to control the model vehicle via the graphical display 203 .
- A system display may be provided with scrolling capability, to scroll the display horizontally or vertically in response to user input.
- A right arrow 336 is indicated at the right of the tender car 318 .
- The arrow points to the right to indicate that additional cars exist behind the tender.
- The display may scroll left or otherwise be changed to show the next cars 366 , 368 behind the last car of the previous display, as shown in FIG. 7 .
- Scroll arrows 364 and 336 may similarly be activated to access other cars, with left arrow 364 used for access to cars in front of car 366 and right arrow 336 used as before.
- Any other suitable method may be used to select and control different cars of a model vehicle.
- FIG. 7 also illustrates control of train cars such as may be provided with controllable features.
- A boxcar 366 may generate sound by activating icon 370 , or may operate its doors in response to activation of active door region 372 .
- A hopper car 368 may play loading sounds in response to selection of icon 376 , or may open or shut its hopper doors in response to activation of active region 374 .
- A coupler image region 373 may be activated to decouple cars, if the cars are equipped with an electronic decoupler.
- Each trailing car may be provided with its own receiver/control module, e.g., a CAB-1 or CAB-2 unit, to receive and execute commands from the control unit 200 .
- A receiver/controller in one car may be used to control accessories in multiple cars via a coupling or other electrical connectors connecting active elements of trailing cars.
- FIG. 8 shows an exemplary display for control of switch accessories, such as may be used to switch model trains from one section or block of track to another.
- Four icons 380 A-D are shown, each corresponding to a different switch accessory. Each icon may be identified by a label, from which the corresponding switch's position in the model layout may be determined.
- Right scroll arrow 336 indicates that additional switch accessories may be accessed by activation of the display scrolling feature.
- The current position of a switch may be indicated by an icon or other graphic representation. For example, icon 378 A indicates that switch 380 A is in a "through" position. Icon 378 B indicates that switch 380 B is in an "out" position.
- The position of a switch may be changed by interacting with an active region or icon. For example, active region 382 may be selected to place switch 380 A in a through position. Region 384 may be selected to place switch 380 A in an out position.
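The through/out switch control described above amounts to keeping a position state per switch accessory and updating both the model and its on-screen icon when an active region is selected. A minimal sketch follows; the class, labels, and command tuple are illustrative assumptions, not part of the disclosure.

```python
class SwitchAccessory:
    """Tracks the commanded position of one model track switch."""
    POSITIONS = ("through", "out")

    def __init__(self, label, position="through"):
        self.label = label
        self.position = position

    def set_position(self, position):
        if position not in self.POSITIONS:
            raise ValueError("unknown switch position: %s" % position)
        self.position = position
        # A real control unit would transmit a command here and then
        # refresh the switch icon to reflect the new state.
        return ("switch", self.label, position)

switch_a = SwitchAccessory("380A")
command = switch_a.set_position("out")
```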
- A graphic representation of a model track layout may also be provided, such as to show the position of switches or other accessories.
- Selecting a switch icon may cause a layout display as shown in FIG. 9 to appear.
- The display may comprise a map 386 of a model track layout.
- The layout may include a plurality of switches represented by corresponding symbols, such as symbols 388 A-B.
- The symbols may comprise active icons or regions.
- Map 386 may further comprise one or more accessory symbols 392 , showing the location of decorative accessories.
- A layout or model vehicles may be equipped with position sensors or locating devices, which may make it possible to show the position of one or more model vehicles, such as shown using symbol 390 . Any desired element of map 386 may comprise an active object for initiating a control command.
- A hobbyist may construct and update a model layout map using an associated application running on the control unit 200 or other computer system.
- A layout display as shown in FIG. 9 may also be useful for configuring a control system.
- A track layout may be used to conveniently program one or more routes within the layout.
- Each route comprises a set of switch settings that causes a model vehicle to move along a specified path of track.
- A user may conveniently set multiple switches at once by selecting a specified route.
- A route may be displayed in a manner similar to FIG. 9 , as well.
- Switch displays as shown in FIG. 8 , or any other desired display, may be used.
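The route mechanism described above — each route a stored set of switch settings applied in one action — might be sketched as follows. The route name, switch labels, and callback interface are assumptions invented for the sketch.

```python
# Each route maps switch labels to the positions that realize the route.
ROUTES = {
    "yard_to_mainline": {"380A": "through", "380B": "out", "380C": "through"},
}

def apply_route(route_name, set_switch):
    """Set every switch in the named route by calling
    set_switch(label, position) once per switch."""
    for label, position in ROUTES[route_name].items():
        set_switch(label, position)
```

Selecting a route in the layout display would call `apply_route` once, setting all switches of the route rather than requiring the user to toggle each one.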
- A layout display as shown in FIG. 9 may be useful for showing the results of system diagnostic tests.
- Track voltages may vary across different sections of track because of connection impedance between track sections, or between a model vehicle and a particular track section.
- A voltage sensor on a model vehicle may be used to detect received voltage as the vehicle moves around the track. These voltages may be recorded for different track positions and mapped using a layout map as shown in FIG. 9 .
- The vehicle voltage may be displayed using a digital or simulated analog (e.g., a waveform) display on the control unit 200 as the model vehicle moves around the track.
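The diagnostic survey described above reduces to recording (position, voltage) samples from the vehicle-mounted sensor and flagging sections whose received voltage sags, e.g., from poor connection impedance. The nominal voltage and tolerance below are assumptions for illustration, not values from the disclosure.

```python
def survey_track(readings, nominal=18.0, tolerance=0.1):
    """Given (position, volts) samples reported by a vehicle-mounted
    voltage sensor, return the track positions whose received voltage
    sags more than `tolerance` (as a fraction) below nominal."""
    floor = nominal * (1.0 - tolerance)
    return sorted(pos for pos, volts in readings if volts < floor)
```

The returned positions could then be highlighted on a layout map such as map 386 to show where connections need attention.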
- A hobbyist may sometimes desire to control trains or other model vehicles using a traditional control interface, such as a Lionel TrainMaster™ interface.
- A control unit 200 may be configured to emulate a traditional control interface 400 , as shown in FIG. 10 .
- The emulated interface 400 may include graphic representations of keys, dials, or other objects found on a traditional interface. Such representations may comprise active objects responsive to pointer input in a manner that emulates a traditional interface.
- Control unit 200 may control model vehicles and accessories using both traditional and graphical interfaces.
- A display screen 203 of a control unit 200 may be provided with a piezoelectric layer 402 disposed to respond to finger touches.
- Layer 402 may be substantially transparent and disposed on top of a display screen, such as on an LCD display.
- In the alternative, the piezoelectric layer may be disposed under a protective layer or display screen.
- The piezoelectric layer 402 may be associated with a touchscreen as previously described.
- The touchscreen may comprise one or more active regions 404 , each aligned with a respective image of a traditional control actuator, e.g., button image 406 or dial image 408 .
- Piezoelectric layer 402 may be configured to vibrate and thereby provide auditory or tactile feedback in response to finger or stylus pressure. For example, the feel or sound of actions such as button presses, slider or dial movements, or button taps may be simulated.
- Piezoelectric elements comprising metal oxides may be sandwiched between two electrodes. When a voltage is applied to the electrodes, the piezoelectric elements expand in the direction of the applied field. The voltage may be alternated at different frequencies to provide a desired auditory or tactile feedback in response to touchscreen input.
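Alternating the drive voltage at different frequencies, as described above, can be sketched as generating a drive waveform per feedback event: low frequencies for a tactile "click," audible frequencies for button-press sounds. The function name, sample rate, and waveform shape are assumptions for illustration only.

```python
import math

def haptic_waveform(frequency_hz, duration_s, sample_rate=8000, amplitude=1.0):
    """Generate normalized drive-voltage samples for a piezoelectric
    layer: a sine alternated at `frequency_hz` for `duration_s` seconds."""
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * frequency_hz * t / sample_rate)
            for t in range(n)]
```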
- A user may perform a method 500 for controlling a model vehicle, exemplary steps of which are shown in FIG. 11 .
- A graphic display is presented on the display screen.
- The user interacts with one or more objects or regions depicted on the screen, and at step 504 , a system receives the pointing input.
- A processor determines an intended user command from the pointing input received, depending on a current state of the display.
- Displayed objects or regions are active, in that a program module is configured to react to defined pointer actions by passing specified information (e.g., information specifying a model vehicle or accessory command) to another software module for communicating the command to the model object.
- Other methods for determining an intended user command from pointer input and a state of the display may also be suitable.
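One conventional way to realize the active-region dispatch described above is a hit-test of the pointer coordinates against rectangles bound to commands for the current display state. Everything in this sketch — region coordinates, command tuples, function names — is an illustrative assumption, not the patented implementation.

```python
# Active regions for the current display state: rectangle -> command.
# Rectangles are (x0, y0, x1, y1) in screen coordinates.
ACTIVE_REGIONS = {
    (10, 10, 50, 30): ("headlamp", "toggle"),
    (60, 10, 100, 30): ("smoke_unit", "toggle"),
}

def command_for_pointer(x, y, regions=ACTIVE_REGIONS):
    """Return the command bound to the active region under the pointer,
    or None when the pointer falls outside every active region."""
    for (x0, y0, x1, y1), command in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return command
    return None
```

Changing the display state would swap in a different region table, which is how the same pointer action can mean different commands on different screens.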
- A command protocol may be selected. This may occur concurrently with step 506 .
- A command protocol for the model object may likewise be associated with the screen object.
- Both a command and a command protocol may be selected.
- Screen objects may be associated with commands or protocols during a configuring step, which may be performed prior to controlling model objects using the control unit. Other methods for selecting a command protocol may be suitable.
- The command may be translated to the selected protocol, if necessary, and transmitted to the model object at step 510 .
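The select-translate-transmit sequence described above might be sketched as a table of per-protocol encoders. The frame strings below are invented placeholders; they are not actual TMCC, FSK, or DC-offset encodings.

```python
def encode_fsk(target, command):
    # Placeholder framing; a real FSK encoder would emit keyed tones.
    return "FSK|%s|%s" % (target, command)

def encode_dc_offset(target, command):
    # Placeholder framing for DC-offset signaling.
    return "DC|%s|%s" % (target, command)

ENCODERS = {"fsk": encode_fsk, "dc_offset": encode_dc_offset}

def transmit(target, command, protocol, send):
    """Translate the command to the selected protocol and hand the
    resulting frame to the transmitter function `send`."""
    frame = ENCODERS[protocol](target, command)
    send(frame)
    return frame
```

Associating a protocol with each screen object then reduces to storing the appropriate `ENCODERS` key alongside the object's command during the configuring step.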
- A return signal may be received from the model object.
- The return signal may comprise an acknowledgement that a command has been received.
- Return information may also concern the state of the model vehicle, e.g., voltage or velocity information.
- The graphics display may be refreshed to reflect a change in state of the model vehicle, such as to show that a command has been received or executed.
- A system control unit may be configured with information concerning model vehicles, switches, and other accessories.
- The control unit may broadcast an inquiry, causing model vehicles within range that are using a compatible command protocol to respond.
- The control unit may process the responses received to update model object information within a control database.
- Such broadcasts, or “scanning,” may occur at periodic intervals, or upon the occurrence of specified conditions.
- Model objects may be configured to transmit information to a control unit when the model object is first powered up, or after a "reset" operation occurs.
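The scanning behavior described above — broadcast an inquiry, fold responses into a control database — might be sketched as follows. The response fields and object identifiers are assumptions invented for the sketch.

```python
def scan(broadcast_inquiry, database):
    """Broadcast an inquiry and fold the responses of in-range,
    protocol-compatible model objects into the control database."""
    for response in broadcast_inquiry():
        # Each response is assumed to carry an object id plus attributes.
        database[response["id"]] = response
    return database

# Example: two model objects answer the inquiry.
def fake_inquiry():
    return [{"id": "engine316", "type": "locomotive"},
            {"id": "switch380A", "type": "switch"}]

db = scan(fake_inquiry, {})
```

Running `scan` at periodic intervals, or on power-up and reset events, would keep the database current, as the surrounding text describes.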
- A user may configure a control database by manually entering information via a suitable user interface.
Description
- 1. Field of the Invention
- The present invention relates to a method and system for remote control of a model vehicle, such as a model train.
- 2. Description of Related Art
- Various methods and systems are known for the remote control of model vehicles, such as model trains. Many systems include a control unit having a user interface. The user interface may include one or more keys, buttons, levers, knobs or the like, for obtaining user input used in control of the model vehicle. For example, a control unit for a model train may include a dial that can be turned for speed control, and numerous buttons for functions such as on/off, train sounds, smoke generators, lights, model track switches and accessories. Control units may also include a display screen for display of text messages, and one or more lights, sound generators, or other system state indicators. Control units for model trains often comprise small hand-held devices, for the convenience of the user. Stationary control interfaces may also be known.
- Remote control interfaces for model trains may communicate with model trains and accessories using various methods. One approach is to provide control signals from a remote control unit via a wired or wireless connection to a base station, which is connected to a model railway track. The base station may translate received control signals into one or more suitable command protocols for a targeted model train or accessory. For example, it is known to send DC offset signals superimposed with AC track power through an electrified rail of the model railway. Model trains or accessories receiving power through the rail may then receive the DC offset signals, decode them using an onboard controller, and respond appropriately.
- Another approach is to transmit a radio-frequency (RF) signal through a rail of the model track, essentially using the model track as a near-field antenna for an RF signal that fades rapidly with distance from the track. For example, in Lionel trains, the signal used is a 455 kHz frequency-shift-keyed (FSK) signal at 5 volts peak-to-peak. This signal creates a field along the length of the track, detectable within a few inches of the track. Trains or accessories are equipped with a receiver for the RF signal, and a microprocessor that implements commands addressed to a particular train car or track accessory. Lionel manufactures a system of remote controllers and compatible modular receiver and control units, referred to as "CAB-1" remote controllers and modules, respectively. CAB-1 modules may be added to Lionel trains and accessories via a standard connector socket, to enable control of a train or accessory using FSK signaling and a CAB-1 remote controller.
- Similarly, an RF or infrared (IR) signal may be broadcast directly to a train or accessory through the air from a base station or remote control unit, received by a receiver on the train or accessory, and processed by a control unit. Such methods are also known for controlling model cars, boats, planes, and other such vehicles that operate without the use of a rail. Other methods of communicating with model trains and vehicles for control purposes are also known.
- The development of various different control methods for model trains over the years has led to a situation where many hobbyists own model trains controlled in different ways, sometimes sharing the same model track layout. In general, many train hobbyists operate several trains and accessories together on the same layout, whether or not using different control methods. Either way, separate control units may be used to control different trains on the same layout, requiring the user to keep track of numerous separate units. Control units for controlling multiple trains and accessories using a single interface are known, but such units make use of keypad interfaces that require a user to memorize specific command sequences to activate specific commands addressed to specific trains or accessories on the layout. As trains and accessories become increasingly complex with ever-increasing control features, memorizing and executing numerous different keystroke commands becomes an increasingly onerous chore for the hobbyist.
- It is desirable, therefore, to provide a control unit and user interface for controlling model trains and accessories that overcomes these and other limitations of prior-art control units.
- The present invention provides a control unit and interface that may be used to control multiple model vehicles and accessories via a single interface, without requiring the user to enter any keystroke commands or to memorize control sequences. In addition, the control unit may be used with multiple different control methods, including but not limited to DC offset control, RF control (including but not limited to FSK control), and IR control. When equipped with suitable interface units, the control unit may be used to control model vehicles and accessories using different control systems. Accordingly, a control unit according to the invention may be used by a hobbyist to control a variety of different types of trains and accessories, or other model vehicles, using a single interface.
- In an embodiment of the invention, the control unit comprises a graphical display screen, a processor executing graphical user interface (GUI) software operative to display graphical images on the display screen, a pointing device operative to provide user input to the processor, and a model vehicle control output. A pointing device may comprise, for example, a touchscreen responsive to finger or stylus touches, a mouse, trackball, touchpad, joystick button, and so forth. The control unit may be implemented using a special-purpose computer, such as an integrated controller, display screen, pointing device, and control output of a hand-held (or desktop) control unit. In the alternative, the control unit may be implemented using standard displays, pointing devices, and outputs of a general purpose computer, using the computer's microprocessor to execute suitably configured control software. One of ordinary skill may implement the novel user interface and control system as disclosed herein in other ways, as well.
- In an embodiment of the invention, the invention comprises software executed on a processor of the control unit. The software is operative to cause graphical displays to appear on the display screen, and generates appropriate commands for controlling trains and accessories in response to input from the pointing device. For example, the software may cause an image or graphical representation of a model train to be controlled to appear on the display screen. Using the pointing device, a user may perform actions such as touching a wheel, touching a light, smokestack, tender, or other part of the train, clicking and dragging the train forwards or backwards, and so forth. The controller receives the pointing device input which is interpreted by the software, such as by using a lookup table or object-oriented program structure, as correlating to specific desired commands. The software may then cause an appropriate command to be transmitted to the model train or accessory using a selected transmission path.
- The software may be configured such that actions of the pointing device correlate intuitively to commands or to actions of trains or accessories, and the display correlates intuitively to the unit under control. For example, clicking and dragging an image of a train forwards or backwards may cause the selected train to move in the indicated direction. A commanded velocity may correlate to a speed or direction of cursor movement, or to a stylus pressure. Selecting images of a component may cause the corresponding component of the model train or accessory to operate. Selecting the component again may cause it to toggle off or on, depending on its last state. Images may be modified to indicate the state of the train or accessory. For example, images of a turning wheel may indicate motion of a train. Similarly, the state of smoke emission devices, lights, switches, and other accessories may be indicated using an image or icon.
- Thus, a user may operate a complex system of multiple vehicles and accessories without needing to learn a command language or navigation of text menus. In the alternative, or in addition, a control unit according to the invention may be configured to emulate any number of legacy controllers, for users who prefer using a legacy control interface.
- A display screen of a control unit may be provided with a piezoelectric layer disposed to respond to finger touches. The piezoelectric layer may be configured to vibrate and thereby provide auditory or tactile feedback in response to finger or stylus pressure. For example, the feel or sound of actions such as button presses, slider or dial movements, or button taps may be simulated. Vehicle sounds, movement, and vibration may also be simulated using the piezoelectric layer.
- A more complete understanding of the method and system for model vehicle and accessory control will be afforded to those skilled in the art, as well as a realization of additional advantages and objects thereof, by a consideration of the following detailed description of the preferred embodiment. Reference will be made to the appended sheets of drawings which will first be described briefly.
- FIG. 1A is a block diagram showing an exemplary system according to the invention.
- FIG. 1B is a block diagram showing an exemplary handheld control unit in a system including legacy control units.
- FIG. 2 is a break-away perspective view showing a handheld control unit according to the invention.
- FIGS. 3-10 are diagrams showing various exemplary graphical displays for a control interface according to the invention.
- FIG. 11 is a flow chart showing exemplary steps of a method according to the invention.
- The present invention provides a method and system for model vehicle and accessory control that overcomes the limitations of the prior art. In the detailed description that follows, like element numerals will be used to indicate like elements appearing in one or more of the figures.
- FIG. 1A shows a system 101 for controlling a model vehicle or accessory, comprising a control unit 100 in communication with a model vehicle 120 , switch accessory 122 , and decorative accessory 124 . In an embodiment of the invention, the control unit 100 may be configured as a handheld device. In the alternative, or in addition, the control unit may be configured as a desktop or layout-mounted device. It should also be possible to implement the control unit using a general purpose computer, which may reduce the need for additional hardware or provide a larger display or more powerful processing engine than might otherwise be possible. It is therefore anticipated that some users may implement the control unit 100 in more than one way, to take advantage of the characteristics of different hardware platforms. -
Control unit 100 may comprise any suitable processor 102 as known in the art. Various standard semiconductor devices for digital processing, and circuits incorporating such devices, are known in the art. The processor 102 should be operatively associated with a memory 108 for holding data and program instructions to be executed by processor 102 . Processor 102 may incorporate various components such as bus controllers, graphic controllers, and other auxiliary components as known in the art. From the description herein, one of ordinary skill should be able to suitably configure processor 102 , including providing it with software or other programming instructions using any suitable computer language. Suitable languages may include, for example, C, C++, Visual Basic, Delphi, Python, Java, assembly language, machine language, or combinations of the foregoing with each other or with other languages. Libraries and routines for handling video displays and various input devices, including input from pointing devices, are generally available for these and other programming languages. - The control unit further comprises a
video display device 104 operatively associated with the processor 102 . Various video display devices are known in the art, and include, for example, computer monitors configured to accept analog signals such as RGB composite, YPbPr component, or S-video; digital monitors and televisions configured for DVI, HDMI, or other digital signal input; and televisions that use an NTSC, PAL, SECAM, or High Definition television signal as input. Digital monitors and televisions often include an adaptor for accepting various analog audio-video (A/V) signals, as well. The display screen itself may comprise a cathode ray tube, liquid-crystal flat panel (LCD), gas plasma panel, or similar device. In an embodiment of the invention, a handheld-size LCD flat panel may be used. The display device 104 may also comprise electronics for driving a display screen using one or more types of video input signals, and for providing audio output, as known in the art. - A
graphical pointing device 106 should also be communicatively connected to processor 102 , to provide an indication of position, movement, pressure, or other pointing characteristics. Pointing device 106 may comprise any suitable hardware or software for sensing position, motion, or touch pressure of a physical object. Such sensors may provide sensor input to a device driver, which provides an interface between the sensor and an application running on processor 102 . Pointing devices may provide an indication of two-dimensional position or movement of an object, for example, a mechanical ball, optical sensor, human fingertip, stylus tip, or eyeball. In some cases, three-dimensional information, touch pressure, and rotational movement information may be provided as well. Various suitable pointing devices and associated software are known in the art, for example, optical or mechanical computer mice, trackballs, joysticks or pointing buttons, mouse and pen pads or tablets, touchpads, and touchscreens. Both touchpads and touchscreens comprise two-dimensional membranes that react to the touch of a finger or stylus tip and provide a signal indicative of position or movement. A touchscreen may comprise a transparent or translucent touchpad membrane disposed over a video display. -
Processor 102 should be configured to receive input from pointing device 106 , and generate commands for controlling model vehicle 120 , switch accessory 122 , or decorative accessory 124 of layout 114 in response to the pointing input. Software for the processor may be configured to provide commands according to separate command protocols, or a single command protocol. Various suitable command protocols for model vehicles and accessories are known in the art. For example, the Lionel TrainMaster™ Command Control makes use of a defined command set and protocol, as do other methods for remote control of model vehicles. - Although various suitable command protocols are known in the art, the novel method as disclosed herein for generating commands for a model vehicle based substantially on input from a pointing device may require developing new systems for correlating pointer actions in synchrony with a video display. For example, "grabbing" an image of a model vehicle wheel on a video display and "dragging" it forward may be correlated with a known command for "start forward motion," and so forth. The invention is not limited to any particular set or system of correlations between pointer actions and vehicle or accessory commands. In general, software for correlating pointer actions to commands or other output is known from other contexts in computer programming, and may be readily adapted by one of ordinary skill to practice the invention.
-
Control unit 100 may further include a transmitter 110 for transmitting a wireless signal to model objects of layout 114 . Optionally, the control unit may also comprise a receiver for receiving information from the model objects. A receiver may be located with transmitter 110 or in a separate component. -
Transmitter 110 may be configured to transmit and receive information with the model objects via one or more transmission paths. For example, information and commands may be transmitted by over-the-air broadcast to a receiver located on the model objects 120 , 122 , 124 . A suitable receiver may comprise, for example, a receiver and controller in a modular unit that may be connected via a removable connector to the power and operational devices of the model object. For its TrainMaster™ compliant (abbreviated herein as "TMCC") model vehicles and accessories, Lionel provides CAB-1 receiver/controller modules. The CAB-1 receiver is configured to receive FSK signals emitted from the model track and is intended for use with TMCC-1 vehicles and accessories. - It is anticipated that CAB-1 modules may be adapted by one of ordinary skill to receive RF or IR over-the-air broadcast signals. A suitably adapted module for receiving RF transmissions may be referred to, for example, as a "CAB-2" module. CAB-2 modules may be provided with a common interface with CAB-1 modules. Thus, it is anticipated that CAB-2 and CAB-1 modules may be used interchangeably in a system according to the invention. Model vehicles and accessories may be upgraded from TMCC-1 to TMCC-2 merely by replacing a CAB-1 receiver module with a CAB-2 module. In addition,
transmitter 110 and other components of the system may be configured to support CAB-2 transmission standards while providing backward compatibility to CAB-1 or other standards. The hobbyist may thereby use control unit 100 to control model vehicles and accessories of different types on the same layout. - In the alternative, or in addition, the
control unit 100 may communicate with model objects via an FSK command interface unit 126 connected to the model layout. Yet another alternative is to communicate via a command interface 128 for any other command protocol, such as DC-offset signaling. Command interface units 126 and 128 may be configured to receive information from the control unit 100 and to generate commands in any desired protocol for model objects in layout 114 capable of receiving and operating accordingly. Command units as known in the art for FSK signaling, DC-offset control, or other protocols may be adapted by one of ordinary skill to receive and process information from the control unit 100 . In the alternative, the control unit 100 may be configured to transmit information in a manner already compatible with prior-art command units. Using any suitable method as described above, a single control unit 100 may control model objects using different protocols. Accordingly, a suitably configured control unit may be used to replace numerous otherwise incompatible prior-art control units for controlling any desired number of different types of model objects on layout 114 . - For DC-offset control,
layout 114 may comprise separate control blocks to which legacy command units 128 may be separately connected. Power may be supplied through a conventional power supply 130. -
Processor 102 may also be configured to communicate via a wireless router 140 or other network device through a network 142 to a remote server 144. This connectivity may be used, for example, to update operating software in memory 108 as improvements are made, or as new models of model vehicles and accessories are released. The control unit may be configured to automatically check, download, or install software updates at periodic intervals, or in response to changes in system 101. In the alternative, or in addition, a user may manually initiate any of these update processes. Information may also be transmitted from the control unit 100 to the server 144; for example, current version information, system state, or an acknowledgment that an update has been successfully installed may be transmitted. -
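For illustration, the periodic update check described above may be sketched as follows. The version format, the stand-in for the network call to server 144, and the function names are assumptions introduced for this sketch; they are not part of the disclosed embodiment.

```python
def check_for_update(installed_version, fetch_latest_version):
    """Compare the installed software version against the latest version
    reported by the remote server; fetch_latest_version stands in for a
    network request to server 144. Versions are (major, minor) tuples
    purely for illustration."""
    latest = fetch_latest_version()
    return latest if latest > installed_version else None


# Simulated server responses: an update is available in the first case only.
update = check_for_update((2, 0), lambda: (2, 1))
no_update = check_for_update((2, 1), lambda: (2, 1))
```

A real control unit would run such a check at the periodic intervals mentioned above, or when the user manually initiates it.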
FIG. 1B shows an exemplary TMCC-2 handheld control unit 100 according to the invention, in a control system together with a TMCC-1 controller 150. Controller 150 transmits control signals wirelessly to an FSK base 152. Base 152 may be configured as a TMCC-1 base, which receives TMCC-1 command signals, translates or transforms the signals to an FSK-compatible format 160, and broadcasts the signals from a rail of a layout 114. Vehicle 120, being equipped with a CAB-1 receiver (not shown), receives the FSK control signals via near field 166, and operates accordingly. The TMCC-1 handheld control unit 150 also wirelessly transmits control signals to a TMCC-1 power control unit 154, which modulates AC phase-controlled power 162 to layout 114 in response to the control signals, for speed control of the vehicle 120. In the alternative, or in addition, handheld control unit 150 may transmit control signals to a TMCC-2 power control unit 156. Power control unit 156 may be backwardly compatible with the TMCC-1 command set, while providing additional TMCC-2 capabilities as described below for use with TMCC-2 handheld controller 100. - TMCC-2
controller 100 may be used instead of, or in addition to, controller 150 to control model vehicles in the same layout 114. Controller 100 may be configured for backward compatibility with the TMCC-1 standard, and thus may transmit TMCC-1 control signals to base 152, which then may operate exactly as described in connection with controller 150. Likewise, controller 100 may transmit a TMCC-1 signal to power control unit 154, for control of AC power in the same manner as used with TMCC-1 controller 150. If vehicle 120 or any other vehicle on layout 114 is equipped with a CAB-2 receiver/transmitter, control unit 100 may wirelessly communicate directly with the so-equipped vehicle, as described in connection with the preceding FIG. 1A. Communication with the CAB-2 transmitter may be two-way; for example, commands may be transmitted to vehicle 120, and data or acknowledgements transmitted from the vehicle 120 to controller 100. -
Control unit 100 may also be used to transmit to and receive from a TMCC-2 power control unit 156 for control of AC phase-controlled power 164 to layout 114. Power unit 156 may comprise a transmitter for transmitting data back to handheld unit 100. Such data may include, for example, voltage, current, or phase information measured at one or more connection points to a power supply rail of track layout 114. In an embodiment of the invention, a CAB-2 equipped vehicle may transmit data regarding track power received by the vehicle to the power control unit 156, which may relay it to unit 100. In the alternative, or in addition, power control unit 156 may include a processor configured to implement a feedback control loop for power control, such as a PI or PID control loop, to more accurately control power supplied to the track 114 or vehicle 120. Power information may also be relayed or transmitted directly to remote unit 100 for diagnostic or control purposes. - While FIGS. 1A-B illustrate embodiments of control units employing multiple control protocols and capabilities, the invention is not limited to the specific protocols and control capabilities disclosed. An important aspect of the invention concerns the use of a graphical user interface (GUI) in a control system or method. This aspect of the invention may be practiced independently from, or in conjunction with, the multiple control protocols and capabilities exemplified above. Further details pertaining to use of a graphical user interface for control are described in the specification below.
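The PI or PID feedback loop mentioned above for power control unit 156 may be sketched as follows. The gains, the toy plant model, and the 16 V setpoint are illustrative assumptions; the patent does not specify an implementation.

```python
class PIDController:
    """Minimal discrete PID loop, such as a power control unit might run
    to regulate measured track voltage toward a setpoint."""

    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self._integral = 0.0
        self._prev_error = None

    def update(self, measured, dt):
        """Return a control output from the latest track-voltage sample."""
        error = self.setpoint - measured
        self._integral += error * dt
        derivative = 0.0 if self._prev_error is None else (error - self._prev_error) / dt
        self._prev_error = error
        return self.kp * error + self.ki * self._integral + self.kd * derivative


# Drive a toy first-order plant toward a 16 V track setpoint.
pid = PIDController(kp=0.8, ki=0.5, kd=0.05, setpoint=16.0)
voltage = 0.0
for _ in range(200):
    voltage += 0.1 * pid.update(voltage, dt=0.1)  # plant responds to control output
```

In the system described above, the measured value would come from the voltage sensed at the track or relayed by a CAB-2 equipped vehicle, and the control output would modulate the AC phase-controlled power.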
-
FIG. 2 shows an exemplary physical package for a handheld graphical control unit 200. Control unit 200 comprises a prominent video display screen 203 for display of graphic images. In an embodiment of the invention, the control unit comprises a substantially keyless interface, relying on the display screen and a pointing device as the primary control interface. The display screen may comprise a flat panel LCD 206 as known in the art, or other suitable display. A substantially transparent or translucent touchscreen membrane sensor element 204 may overlay the LCD 206 and be configured as a pointing device, as known in the art. Suitable touchscreen membranes may comprise, for example, capacitive touchscreens or resistive touchscreens, or more advanced technologies such as IR scanning or surface acoustic wave (SAW) touchscreens. The membrane element may be connected to a processor (e.g., processor 102 shown in FIG. 1A) via a universal serial bus (USB), serial, or other suitable connection via a suitable interface device, as known in the art. - The
control unit 200 may also comprise one or more electronics boards 208 for holding system electronics, such as processors, transmitter/receivers, display and pointer interfaces, memory devices, batteries, and so forth. The arrangement of internal components may be accomplished in any suitable manner by one of ordinary skill. While the depicted embodiment suggests a general anticipated arrangement of elements, one of ordinary skill should understand that it is not intended to depict actual details of a mechanical layout, which may be determined during a design process by one of ordinary skill. -
Display 203 may be supported and protected by a sturdy frame 202, comprising any suitable structural material, such as plastic or metal. It may be assembled using various fasteners or adhesives as known in the art. An outer periphery of the frame 202 may project beyond the display surface 203. The frame may be contoured or provided with a rubberized outer layer for comfortable holding. Optionally, the control unit may be provided with a removable case (not shown), as generally known in the art for handheld appliances. - An exterior of the
frame 202 may be provided with various interface features, for example, a retractable antenna 210 connected to a transmitter/receiver of the control unit. One or more interface ports 212A-B may be provided for wired connections, and a power port 214 may be provided for supplying power to system electronics. In an embodiment of the invention, the control unit 200 may be used in a wireless mode to control model vehicles and accessories. Wired ports 212A-B and 214 may be used during system off times, such as while recharging system batteries, reconfiguring system software, or configuring other system components. It may also be advantageous to supply the control unit 200 with a physical on/off switch as known in the art (not shown). - Many other different forms of physical package may also be suitable, besides the
exemplary embodiment 200 shown in FIG. 2. For example, instead of a generally rectangular frame, the frame may be curved, contoured, include one or more hand grips or finger grips, or may comprise any other suitable shape. In the alternative to, or in addition to, touchscreen 204, a different pointing device may be used, such as a joystick button or touchpad. The user interface may comprise a substantially keyless interface as shown, or in the alternative, one or more keys may be included. The display 203 is not limited to the rectangular shape depicted, and may be provided in any suitable shape. In addition, the control unit need not be provided as a handheld device, and may even be implemented using a general-purpose computer. However, it is desirable that a video display screen of some type comprise a prominent feature of the control unit, to facilitate a graphical pointing-based control method as described herein. - The display screen of the control unit may be used to display a series of graphic images and icons. A user indicates desired control actions by interacting with the display using a graphical pointing device. The control unit determines the intended control action from the pointer input and state of the graphic display, and transmits an appropriate command or message to one or more selected model vehicles, accessories, or command interface units.
FIGS. 3-10 provide examples of display screens for providing a user with a substantially keyless control interface. One of ordinary skill may readily program such graphic displays using various suitable computer languages. These figures merely exemplify the conceptual framework of the invention, and the invention is not limited to the particular images or arrangement of images described herein. One of ordinary skill may devise a virtually unlimited variety of images, icons, and arrangements for use with the invention. Indeed, it is anticipated that particular control units may be frequently updated with software providing new and different interactive graphic interfaces for the enjoyment of the model vehicle hobbyist. -
FIGS. 3-10 provide examples of generally text-free control interfaces. One advantage of a text-free interface is that text need not be translated into different languages for use by different language speakers. However, it is generally recognized that graphical user interfaces may be made easier to learn or operate by the use of accompanying text messages or labels. The near absence of text in FIGS. 3-10 should not be taken to suggest that use of text is undesirable. Text may be used, for example in pop-up windows or labels, to facilitate understanding and navigation of the graphical control interface. - In addition, although
FIGS. 3-10 show displays for a display screen 203 of a control unit 200, the invention is not limited thereby. It is anticipated that displays may be provided in other shapes and proportions, and the concepts of the invention may readily be adapted to various different displays. For example, larger, more complex displays may be implemented on larger screens, such as on display monitors for general-purpose computers. -
FIG. 3 shows an exemplary display such as may be used for a top-level or upper-level screen. That is, a user may be presented with a display as shown in FIG. 3 on a display 203 after turning the control unit on or first starting the control application, as a “home” screen. From this display, the user may be presented with options for opening various different lower-level screens. For example, by selecting icon 302 using a pointer action, a user may indicate a desire to select one or more model vehicles for control. In response, the control unit may generate a display containing one or more images of model vehicles, or graphics representing model vehicles, as shown in FIG. 4. By selecting an image or icon representing a model vehicle, the user may indicate a desire to control that vehicle. One model vehicle is shown in FIG. 4, but it should be appreciated that multiple vehicles may be shown on a single screen. - Referring again to
FIG. 3, by selecting icon 304, a user may indicate a desire to control switch accessories of a model vehicle layout. By selecting icon 306, a user may indicate a desire to control other components of a layout, such as accessories. Icon 308 may be used to open screens for configuring the interface, updating information regarding model vehicles and accessories controlled by the system, setting user preferences for system operation, and so forth. Button icons for initiating traditional navigation functions for graphical user interfaces may also be provided. For example, a “back” button 310 or a “forward” button 312 may be provided, for calling previously displayed screens as known in the art. Icon 314 may be used to indicate a desire to place the control unit in an emulation mode, calling for a display emulating the function of a prior-art control unit, for example, a Lionel TrainMaster™ control unit. - A
cursor 305 may be used to indicate a current position of the pointing device. In the alternative, the cursor may be omitted. In addition, other ways of indicating cursor actions or menu status may be used, for example, “graying” inactive icons, highlighting selected icons, providing pop-up windows in response to pointer input, and so forth. -
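The home-screen behavior described above, in which each icon opens a lower-level screen or mode, amounts to a hit test against active regions followed by a dispatch. The following sketch illustrates the idea; all region coordinates and screen names are assumptions made for illustration only.

```python
# Each home-screen icon binds a rectangular active region to the
# lower-level screen it opens; coordinates and names are illustrative.
HOME_ICONS = {
    "icon_302": {"rect": (0, 0, 50, 50),    "opens": "vehicle_selection"},
    "icon_304": {"rect": (60, 0, 110, 50),  "opens": "switch_control"},
    "icon_306": {"rect": (120, 0, 170, 50), "opens": "accessory_control"},
    "icon_314": {"rect": (180, 0, 230, 50), "opens": "emulation_mode"},
}

def screen_for_click(x, y):
    """Hit-test the pointer position against the active icon regions and
    return the lower-level screen to open, or None on a miss."""
    for icon in HOME_ICONS.values():
        x0, y0, x1, y1 = icon["rect"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return icon["opens"]
    return None

opened = screen_for_click(75, 25)   # falls inside the icon_304 region
missed = screen_for_click(500, 500)
```

The same hit-test pattern extends to the active image regions described for the lower-level screens below.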
FIG. 4 shows an exemplary display for controlling a model vehicle, such as a model train, on a display screen 203 of a control unit 200. The display includes an image of a model vehicle 321. Two cars, a locomotive 316 and tender 318, are shown. Any number of connected cars of various types may also be displayed together. - Certain regions of the display may be configured as active regions responsive to pointer input, as known in the art. For example, by clicking or otherwise indicating
region 320, the headlamp may be toggled on or off. Region 322 may be used to toggle on or off a smoke generator, or to call up a screen for more complex control options. Icon 324 or a region of the image may be used to turn off or on a sound generator. Different icons may be used to indicate different sounds. For example, a bell icon 326 may be used to sound the engine's bell. In the alternative, or in addition, the type of sound may be determined by the location of a sound icon or image region. For example, icon 324 may be used to trigger a horn sound; icon 332 may trigger engineer sounds; and icon 334 may trigger coal-loading sounds. - Similarly,
wheels 328 may indicate an active region for controlling movement of the model vehicle. For example, forward motion may be initiated by clicking and dragging the wheels forward. The action may be repeated to increase velocity. Clicking and dragging backward may slow the train, or send it in a reverse direction, depending on its initial velocity. In the alternative, velocity may be indicated by a rate or direction at which the pointer is moved. For example, moving the pointer up may cause an increase in engine power, while moving it down may cause a decrease. Any other suitable pointer action may also be interpreted to control vehicle movement, as desired. - The display may be provided with any desired navigation icon buttons, for example,
back button 310 and forward button 312, as previously described. In addition, a home button 338 may be provided, which provides a direct link to an opening or “home” screen. Other navigation icon buttons may include shortcut buttons 340, 342, and 344. Button 340 may provide a shortcut to a vehicle selection page, which may permit selection of a different system vehicle for control. Button 342 may provide a link to a page for selecting switch accessories for control, and button 344 may similarly provide for selection of other accessories. - The
control unit 200 may be configured to provide a current indication of the state of the model object, or of the last command sent to the model object. The control unit may determine a current state from information received from a model object, from the last command sent to the model object, or from some combination of the foregoing. The display may then be altered to reflect a current object or command state. For example, FIG. 5 shows indicators 346 and 348 added to the engine 316 display to indicate an operational state of a headlamp or a smoke-generating unit, respectively. Indicator 346 may suggest a beam of light emanating from a lamp, and may be present when the lamp indicator is in an ON state. Indicator 348 may suggest puffs of smoke emanating from a smokestack of engine 316 when the model vehicle smoke unit is turned on. A great variety of other indicators may also be provided, as desired. - Referring again to
FIG. 4, features of the display may be animated to indicate a state of the model vehicle under control. For example, wheels 328 may be animated so as to appear to turn at a rate proportional to engine power. Couplers may be animated to indicate a coupling or decoupling action. Animated figures, such as train engineers, may be made to perform actions that correlate to sounds being generated and commanded vehicle actions. In addition, animation effects may be added merely for decorative effect, independently of vehicle or command state, if desired. - The display may be provided with icons or active regions for linking to related displays providing access to richer, more complex features of the model vehicle and system software. For example, an
active region 330 may indicate a control panel designed to suggest controls or gauges for the modeled locomotive. After selecting this region using a defined pointing action, the display may be changed to provide an enlarged view of the modeled control panel 350, shown in FIG. 6. Display 350 may comprise numerous gauges which may change depending on the actual engine state, the command state of the software, or some combination of the foregoing. Based on such inputs, the control unit may determine simulated engine values designed to model engine conditions that may occur in an actual engine. For example, gauge 352 may indicate train speed; gauge 354, steam pressure; gauge 356, furnace temperature; gauge 358, engine oil pressure; and gauge 360, brake pressure. The control panel may also be provided with various command icons or active regions to emulate actual controls available on the modeled locomotive, such as steam or fuel valves, brake levers, throttles, etc., in response to pointer input. One of many such possible controls is indicated diagrammatically at 362. - Using a display such as
display 350, the user may therefore control the model train in a manner similar to the way that an engineer might have controlled the actual modeled locomotive. The control unit may thus serve as a vehicle simulator, with the model vehicle responding to user input via a control panel 350. Depending on the skill and experience of the hobbyist, more or less sophisticated simulated control panels may be provided. The control unit may thus provide multiple different ways for control of the model vehicle via the graphical display 203. - Referring yet again to
FIG. 4, a system display may be provided with scrolling capability, to scroll the display horizontally or vertically in response to user input. A right arrow 336 is indicated at the right of the tender car 318. The arrow points to the right to indicate that additional cars exist behind the tender. In response to activation of arrow 336 by any suitable defined pointing action, the display may scroll left or otherwise be changed to show the next cars 366, 368, as shown in FIG. 7. Scroll arrows may again be provided, with left arrow 364 used for access to cars in front of car 366 and right arrow 336 used as before. Besides scrolling, any other suitable method may be used to select and control different cars of a model vehicle. -
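The click-and-drag speed control described above for wheels 328 can be sketched as a small gesture interpreter: a forward drag increases commanded velocity, and a backward drag slows or reverses the vehicle. The step size, sign convention, and function name are illustrative assumptions, not part of the disclosed embodiment.

```python
def interpret_wheel_drag(start_x, end_x, current_velocity, step=1):
    """Map a horizontal drag on the wheel region to a new commanded
    velocity: dragging forward speeds the train up; dragging backward
    slows it, or reverses it once velocity passes through zero."""
    delta = end_x - start_x
    if delta > 0:                     # forward drag
        return current_velocity + step
    if delta < 0:                     # backward drag
        return current_velocity - step
    return current_velocity          # no horizontal motion

v = 0
v = interpret_wheel_drag(100, 160, v)   # forward drag: speed up
v = interpret_wheel_drag(100, 160, v)   # repeated to increase velocity
v = interpret_wheel_drag(160, 90, v)    # backward drag: slow down
```

A variant, as the text notes, could scale the step by the rate of pointer movement rather than using a fixed increment.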
FIG. 7 also illustrates control of train cars such as may be provided with controllable features. For example, a boxcar 366 may generate sound by activating icon 370, or may operate its doors in response to activation of active door region 372. Likewise, hopper car 368 may play loading sounds in response to selection of icon 376, or may open or shut its hopper doors in response to activation of active region 374. A coupler image region 373 may be activated to decouple cars if equipped with an electronic decoupler. A great variety of other cars may similarly be controlled. Each trailing car may be provided with its own receiver/control module, e.g., a CAB-1 or CAB-2 unit, to receive and execute commands from the control unit 200. In the alternative, a receiver/controller in one car may be used to control accessories in multiple cars via a coupling or other electrical connectors for connecting active elements of trailing cars. -
FIG. 8 shows an exemplary display for control of switch accessories, such as may be used to switch model trains from one section or block of track to another. Four icons 380A-D are shown, each corresponding to a different switch accessory. Each icon may be identified by a label, from which the corresponding switch's position in the model layout may be determined. Right scroll arrow 336 indicates that additional switch accessories may be accessed by activation of the display scrolling feature. The current position of a switch may be indicated by an icon or other graphic representation. For example, icon 378A indicates that switch 380A is in a “through” position. Icon 378B indicates that switch 380B is in an “out” position. The position of a switch may be changed by interacting with an active region or icon. For example, active region 382 may be selected to place switch 380A in a through position. Region 384 may be selected to place switch 380A in an out position. Thus, a plurality of switches may be controlled using a single display and intuitive graphical interface. - It may sometimes be desirable to show a graphic representation of a model track layout, such as to show the position of switches or other accessories. For example, selecting a switch icon (e.g.,
icon 380A) may cause a layout display as shown in FIG. 9 to appear. The display may comprise a map 386 of a model track layout. The layout may include a plurality of switches represented by corresponding symbols, such as symbols 388A-B. The symbols may comprise active icons or regions. Map 386 may further comprise one or more accessory symbols 392, showing the location of decorative accessories. A layout or model vehicles may be equipped with position sensors or locating devices, which may make it possible to show the position of one or more model vehicles, such as shown using symbol 390. Any desired element of map 386 may comprise an active object for initiating a control command. A hobbyist may construct and update a model layout map using an associated application running on the control unit 200 or other computer system. - A layout display as shown in
FIG. 9 may also be useful for configuring a control system. For example, a track layout may be used to conveniently program one or more routes within the layout. Each route comprises a set of switch settings that cause a model vehicle to move along a specified route of track. Once configured, a user may conveniently set multiple switches at once by selecting a specified route. A route may be displayed in a manner similar to FIG. 9, as well. In the alternative to using a display of a track layout to define a route, switch displays as shown in FIG. 8, or any other desired display, may be used. - Still further, a layout display as shown in
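A route, as described above, reduces to a named set of switch settings applied together by a single user action. The following sketch illustrates that reduction; the switch identifiers, position names, and the stand-in for actually transmitting switch commands are assumptions made for illustration.

```python
def apply_route(route, switch_states):
    """Apply a programmed route: route maps switch ids to the position
    ("through" or "out") each switch must take; switch_states records
    current positions and is updated as each command would be sent."""
    for switch_id, position in route.items():
        switch_states[switch_id] = position   # stand-in for transmitting a command
    return switch_states

# One pointer action sets every switch on the route at once.
states = {"380A": "out", "380B": "through", "380C": "out"}
mainline_route = {"380A": "through", "380C": "through"}
states = apply_route(mainline_route, states)
```

Selecting a route on the layout map would then invoke such a function, leaving switches not named in the route unchanged.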
FIG. 9 may be useful for showing the results of system diagnostic tests. For example, track voltages may vary across different sections of track because of connection impedance between track sections, or between a model vehicle and a particular track section. A voltage sensor on a model vehicle may be used to detect received voltage as the vehicle moves around the track. These voltages may be recorded for different track positions and mapped using a layout map as shown in FIG. 9. In the alternative, or in addition, the vehicle voltage may be displayed using a digital or simulated analog (e.g., a waveform) display on the control unit 200 as the model vehicle moves around the track. - A hobbyist may sometimes desire to control trains or other model vehicles using a traditional control interface, such as a Lionel TrainMaster™ interface. A
control unit 200 according to the invention may be configured to emulate a traditional control interface 400, as shown in FIG. 10. The emulated interface 400 may include graphic representations of keys, dials, or other objects found on a traditional interface. Such representations may comprise active objects responsive to pointer input in a manner that emulates a traditional interface. Thus, control unit 200 may control model vehicles and accessories using both traditional and graphical interfaces. - In an embodiment of the invention, a
display screen 203 of a control unit 200 may be provided with a piezoelectric layer 402 disposed to respond to finger touches. Layer 402 may be substantially transparent and disposed on top of a display screen, such as an LCD display. In the alternative, the piezoelectric layer may be disposed under a protective layer or display screen. The piezoelectric layer 402 may be associated with a touchscreen as previously described. The touchscreen may comprise one or more active regions 404, each aligned with a respective image of a traditional control actuator, e.g., button image 406 or dial image 408. Piezoelectric layer 402 may be configured to vibrate and thereby provide auditory or tactile feedback in response to finger or stylus pressure. For example, the feel or sound of actions such as button presses, slider or dial movements, or button taps may be simulated. - Various suitable methods of constructing a piezoelectric layer in conjunction with a display screen are known in the art, and any suitable method may be used. In an embodiment of the invention, piezoelectric elements comprising metal oxides may be sandwiched between two electrodes. When a voltage is applied to the electrodes, the piezoelectric elements expand in the direction of the voltage. The voltage may be alternated at different frequencies to provide a desired auditory or tactile feedback in response to touchscreen input.
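Alternating the drive voltage at different frequencies, as described above, amounts to synthesizing a waveform for the piezoelectric layer. The following sketch generates one such drive waveform; the sample rate, frequency, and duration are illustrative values chosen for this example, not parameters from the disclosure.

```python
import math

def piezo_drive_samples(freq_hz, duration_s, sample_rate=8000, amplitude=1.0):
    """Generate a normalized alternating drive waveform for the
    piezoelectric layer; different frequencies and durations would
    simulate different tactile effects, e.g. a short burst for a
    button tap versus a longer buzz for a dial movement."""
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

# A brief 250 Hz burst, such as might accompany a simulated button press.
click = piezo_drive_samples(freq_hz=250.0, duration_s=0.02)
```

In hardware, such samples would be scaled and applied across the electrodes sandwiching the piezoelectric elements.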
- Using a system in accordance with the foregoing, a user may perform a
method 500 for controlling a model vehicle, exemplary steps of which are shown in FIG. 11. At step 502, a graphic display is presented on the display screen. The user interacts with one or more objects or regions depicted on the screen, and at step 504, a system receives the pointing input. - At
step 506, a processor determines an intended user command from the pointing input received, depending on a current state of the display. In an embodiment of the invention, displayed objects or regions are active, in that a program module is configured to react to defined pointer actions by passing specified information (e.g., information specifying a model vehicle or accessory command) to another software module for communicating the command to the model object. Other methods for determining an intended user command from pointer input and a state of the display may also be suitable. - At
step 508, a command protocol may be selected. This may occur concurrently with step 506. For example, when a screen object is associated with a particular model object, a command protocol for the model object may likewise be associated with the screen object. Thus, when the user selects the screen object, both a command and a command protocol may be selected. Screen objects may be associated with commands or protocols during a configuring step, which may be performed prior to controlling model objects using the control unit. Other methods for selecting a command protocol may be suitable. - At
step 509, the command may be translated to the selected protocol, if necessary, and transmitted to the model object at step 510. At step 512, a return signal may be received from the model object. For example, the return signal may comprise an acknowledgement that a command has been received. Return information may also concern the state of the model vehicle, e.g., voltage or velocity information. At step 514, the graphics display may be refreshed to reflect a change in state of the model vehicle, such as to show that a command has been received or executed. - Prior to operation of a control method according to the invention, a system control unit may be configured with information concerning model vehicles, switches, and other accessories. In an embodiment of the invention, the control unit may broadcast an inquiry, causing model vehicles within range that are using a compatible command protocol to respond. The control unit may process the responses received to update model object information within a control database. Such broadcasts, or “scanning,” may occur at periodic intervals, or upon the occurrence of specified conditions. In the alternative, or in addition, model objects may be configured to transmit information to a control unit when the model object is first powered up, or after a “reset” operation occurs. Still further, a user may configure a control database by manually entering information via a suitable user interface.
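The sequence of steps described above, from pointer input through protocol selection, translation, transmission, and return signal, can be condensed into a single dispatch sketch. Everything here (the screen-object table, the protocol names reused from the earlier discussion, and the command encodings) is an illustrative assumption rather than the patent's actual implementation.

```python
# Screen objects bound to a target model object, its command protocol,
# and an abstract command (steps 506 and 508); entries are illustrative.
SCREEN_OBJECTS = {
    "headlamp_region": {"target": "loco_120", "protocol": "TMCC-2",
                        "command": "TOGGLE_HEADLAMP"},
    "switch_380A":     {"target": "switch_1", "protocol": "FSK",
                        "command": "SET_THROUGH"},
}

def translate(command, protocol):
    """Step 509: translate the abstract command into the selected
    protocol's wire format (an assumed textual encoding here)."""
    return f"{protocol}:{command}"

def handle_pointer_input(object_id, transmit):
    """Steps 506-512: resolve the command and protocol for the selected
    screen object, translate, transmit, and return the model object's
    reply so the display can be refreshed (step 514)."""
    entry = SCREEN_OBJECTS[object_id]
    frame = translate(entry["command"], entry["protocol"])
    return transmit(entry["target"], frame)

# A stand-in transmitter that simply acknowledges what it was given.
reply = handle_pointer_input("switch_380A",
                             transmit=lambda target, frame: ("ACK", target, frame))
```

The configuring step described above would populate such a table, whether by the broadcast "scanning" inquiry, by power-up announcements from model objects, or by manual entry.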
- Having thus described a preferred embodiment of method and system for model vehicle and accessory control, it should be apparent to those skilled in the art that certain advantages of the within system have been achieved. It should also be appreciated that various modifications, adaptations, and alternative embodiments thereof may be made within the scope and spirit of the present invention. For example, a system as applied to model train control has been illustrated, but it should be apparent that the inventive concepts described above would be equally applicable to control of other model vehicles and accessories. The invention is defined by the following claims.
Claims (40)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/096,841 US20060226298A1 (en) | 2005-03-30 | 2005-03-30 | Graphical method and system for model vehicle and accessory control |
Publications (1)
Publication Number | Publication Date |
---|---|
US20060226298A1 true US20060226298A1 (en) | 2006-10-12 |
Family
ID=37082302
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/096,841 Abandoned US20060226298A1 (en) | 2005-03-30 | 2005-03-30 | Graphical method and system for model vehicle and accessory control |
Country Status (1)
Country | Link |
---|---|
US (1) | US20060226298A1 (en) |
Cited By (76)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060100753A1 (en) * | 2004-11-10 | 2006-05-11 | Katzer Matthew A | Model train control |
US20070052856A1 (en) * | 2005-06-02 | 2007-03-08 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware. | Composite image selectivity |
US20070078574A1 (en) * | 2005-09-30 | 2007-04-05 | Davenport David M | System and method for providing access to wireless railroad data network |
US20070076750A1 (en) * | 2005-09-30 | 2007-04-05 | Microsoft Corporation | Device driver interface architecture |
US20080065284A1 (en) * | 1998-06-24 | 2008-03-13 | Katzer Matthew A | Model train control system |
US20090054768A1 (en) * | 2007-08-24 | 2009-02-26 | Menachem Halmann | Method and apparatus for voice recording with ultrasound imaging |
US20090125287A1 (en) * | 2005-04-07 | 2009-05-14 | New York Air Brake Corporation | Multimedia Train Simulator |
US20090162814A1 (en) * | 2005-12-06 | 2009-06-25 | Andrew Warburton Swan | Video-captured model vehicle simulator |
US20090177606A1 (en) * | 2008-01-09 | 2009-07-09 | Robert Lee Angell | Risk assessment in a pre/post security area within an airport |
US20090177609A1 (en) * | 2008-01-09 | 2009-07-09 | Robert Lee Angell | Risk assessment in an area external to an airport |
US20090177615A1 (en) * | 2008-01-09 | 2009-07-09 | Robert Lee Angell | Risk assessment between airports |
US20090177605A1 (en) * | 2008-01-09 | 2009-07-09 | Robert Lee Angell | Risk assessment within an aircraft |
US20090177608A1 (en) * | 2008-01-08 | 2009-07-09 | Robert Lee Angell | Risk assessment in a gate area of an airport |
US20090267920A1 (en) * | 2008-04-24 | 2009-10-29 | Research In Motion Limited | System and method for generating a feedback signal in response to an input signal provided to an electronic device |
US20090284476A1 (en) * | 2008-05-13 | 2009-11-19 | Apple Inc. | Pushing a user interface to a remote device |
US7711458B2 (en) | 2000-04-03 | 2010-05-04 | Katzer Matthew A | Model train control system |
US20100109899A1 (en) * | 2008-11-05 | 2010-05-06 | Michael Scott Mitchell | Method and system for vital display systems |
US20100135641A1 (en) * | 2008-12-03 | 2010-06-03 | D-Box Technologies Inc. | Method and device for encoding vibro-kinetic data onto an lpcm audio stream over an hdmi link |
US20100231540A1 (en) * | 2009-03-12 | 2010-09-16 | Immersion Corporation | Systems and Methods For A Texture Engine |
US20100293462A1 (en) * | 2008-05-13 | 2010-11-18 | Apple Inc. | Pushing a user interface to a remote device |
US7870085B2 (en) | 2008-01-09 | 2011-01-11 | International Business Machines Corporation | Risk assessment between aircrafts |
US20110109549A1 (en) * | 2007-04-24 | 2011-05-12 | Irobot Corporation | Control System for a Remote Vehicle |
US20110145863A1 (en) * | 2008-05-13 | 2011-06-16 | Apple Inc. | Pushing a graphical user interface to a remote device with display rules provided by the remote device |
EP2364757A1 (en) * | 2010-03-11 | 2011-09-14 | Parrot | Method and device for remote control of a drone, in particular a rotary-wing drone |
FR2957265A1 (en) * | 2010-03-11 | 2011-09-16 | Parrot | Method for implementing remote control apparatus e.g. Iphone type cellular telephone used to remotely control helicopter, involves activating control command based on analysis results |
US20110265003A1 (en) * | 2008-05-13 | 2011-10-27 | Apple Inc. | Pushing a user interface to a remote device |
US20110261201A1 (en) * | 2008-11-25 | 2011-10-27 | Toyota Jidosha Kabushiki Kaisha | Input device, vehicle environment monitoring apparatus, icon switch selection method, and recording medium |
US20110296340A1 (en) * | 2010-05-31 | 2011-12-01 | Denso Corporation | In-vehicle input system |
CN102034341B (en) * | 2009-09-30 | 2012-12-12 | 金宝电子工业股份有限公司 | Control system and method for generating control picture by using control system |
US8463953B2 (en) | 2010-08-18 | 2013-06-11 | Snap-On Incorporated | System and method for integrating devices for servicing a device-under-service |
US8560168B2 (en) | 2010-08-18 | 2013-10-15 | Snap-On Incorporated | System and method for extending communication range and reducing power consumption of vehicle diagnostic equipment |
US8606383B2 (en) | 2005-01-31 | 2013-12-10 | The Invention Science Fund I, Llc | Audio sharing |
US8681225B2 (en) | 2005-06-02 | 2014-03-25 | Royce A. Levien | Storage access technique for captured data |
US8754779B2 (en) | 2010-08-18 | 2014-06-17 | Snap-On Incorporated | System and method for displaying input data on a remote display device |
US8804033B2 (en) | 2005-10-31 | 2014-08-12 | The Invention Science Fund I, Llc | Preservation/degradation of video/audio aspects of a data stream |
US8902320B2 (en) | 2005-01-31 | 2014-12-02 | The Invention Science Fund I, Llc | Shared image device synchronization or designation |
US20140360399A1 (en) * | 2013-06-11 | 2014-12-11 | BlueRail Trains LLC | Wireless model railroad control system |
US8964054B2 (en) | 2006-08-18 | 2015-02-24 | The Invention Science Fund I, Llc | Capturing selected image objects |
US8983785B2 (en) | 2010-08-18 | 2015-03-17 | Snap-On Incorporated | System and method for simultaneous display of waveforms generated from input signals received at a data acquisition device |
US8988537B2 (en) | 2005-01-31 | 2015-03-24 | The Invention Science Fund I, Llc | Shared image devices |
US9001215B2 (en) | 2005-06-02 | 2015-04-07 | The Invention Science Fund I, Llc | Estimating shared image device operational capabilities or resources |
US9076208B2 (en) | 2006-02-28 | 2015-07-07 | The Invention Science Fund I, Llc | Imagery processing |
US9082456B2 (en) | 2005-01-31 | 2015-07-14 | The Invention Science Fund I Llc | Shared image device designation |
US9093121B2 (en) | 2006-02-28 | 2015-07-28 | The Invention Science Fund I, Llc | Data management of an audio data stream |
US9117321B2 (en) | 2010-08-18 | 2015-08-25 | Snap-On Incorporated | Method and apparatus to use remote and local control modes to acquire and visually present data |
US9124729B2 (en) | 2005-01-31 | 2015-09-01 | The Invention Science Fund I, Llc | Shared image device synchronization or designation |
US9167195B2 (en) | 2005-10-31 | 2015-10-20 | Invention Science Fund I, Llc | Preservation/degradation of video/audio aspects of a data stream |
US20150306512A1 (en) * | 2003-11-26 | 2015-10-29 | Lionel Llc | Model Train Control System |
US9191611B2 (en) | 2005-06-02 | 2015-11-17 | Invention Science Fund I, Llc | Conditional alteration of a saved image |
US9283674B2 (en) | 2014-01-07 | 2016-03-15 | Irobot Corporation | Remotely operating a mobile robot |
US9311115B2 (en) | 2008-05-13 | 2016-04-12 | Apple Inc. | Pushing a graphical user interface to a remote device with display rules provided by the remote device |
US9330507B2 (en) | 2010-08-18 | 2016-05-03 | Snap-On Incorporated | System and method for selecting individual parameters to transition from text-to-graph or graph-to-text |
US20160147233A1 (en) * | 2014-11-20 | 2016-05-26 | Honda Motor Co., Ltd. | System and method for remote virtual reality control of movable vehicle partitions |
US20160194041A1 (en) * | 2011-12-16 | 2016-07-07 | Entro Industries, Inc. | Control System for Load Transportation Device |
US9451200B2 (en) | 2005-06-02 | 2016-09-20 | Invention Science Fund I, Llc | Storage access technique for captured data |
US9489717B2 (en) | 2005-01-31 | 2016-11-08 | Invention Science Fund I, Llc | Shared image device |
US9621749B2 (en) | 2005-06-02 | 2017-04-11 | Invention Science Fund I, Llc | Capturing selected image objects |
US9633492B2 (en) | 2010-08-18 | 2017-04-25 | Snap-On Incorporated | System and method for a vehicle scanner to automatically execute a test suite from a storage card |
USD786887S1 (en) * | 2013-04-19 | 2017-05-16 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
US9696803B2 (en) | 2009-03-12 | 2017-07-04 | Immersion Corporation | Systems and methods for friction displays and additional haptic effects |
US9746923B2 (en) | 2009-03-12 | 2017-08-29 | Immersion Corporation | Systems and methods for providing features in a friction display wherein a haptic effect is configured to vary the coefficient of friction |
US9819490B2 (en) | 2005-05-04 | 2017-11-14 | Invention Science Fund I, Llc | Regional proximity for shared image device(s) |
US9910341B2 (en) | 2005-01-31 | 2018-03-06 | The Invention Science Fund I, Llc | Shared image device designation |
US9927873B2 (en) | 2009-03-12 | 2018-03-27 | Immersion Corporation | Systems and methods for using textures in graphical user interface widgets |
US9942511B2 (en) | 2005-10-31 | 2018-04-10 | Invention Science Fund I, Llc | Preservation/degradation of video/audio aspects of a data stream |
US9994243B1 (en) * | 2017-07-05 | 2018-06-12 | Siemens Industry, Inc. | Clear enclosure top dome for end of train device |
US10003762B2 (en) | 2005-04-26 | 2018-06-19 | Invention Science Fund I, Llc | Shared image devices |
US10007340B2 (en) | 2009-03-12 | 2018-06-26 | Immersion Corporation | Systems and methods for interfaces featuring surface-based haptic effects |
US10097756B2 (en) | 2005-06-02 | 2018-10-09 | Invention Science Fund I, Llc | Enhanced video/still image correlation |
US20190259080A1 (en) * | 2011-10-17 | 2019-08-22 | Johnson Controls Technology Company | Battery selection and feedback system and method |
US10564721B2 (en) | 2009-03-12 | 2020-02-18 | Immersion Corporation | Systems and methods for using multiple actuators to realize textures |
US10652719B2 (en) | 2017-10-26 | 2020-05-12 | Mattel, Inc. | Toy vehicle accessory and related system |
CN112269385A (en) * | 2020-10-23 | 2021-01-26 | 北京理工大学 | Cloud unmanned vehicle dynamics control system and method |
US11262899B2 (en) * | 2014-09-10 | 2022-03-01 | Lego A/S | Method for establishing a functional relationship between input and output functions |
US11471783B2 (en) | 2019-04-16 | 2022-10-18 | Mattel, Inc. | Toy vehicle track system |
US20230280890A1 (en) * | 2022-03-04 | 2023-09-07 | Ford Global Technologies, Llc | Methods and systems for vehicle interface control |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6437836B1 (en) * | 1998-09-21 | 2002-08-20 | Navispace, Inc. | Extended functionally remote control system and method therefore |
US20020167529A1 (en) * | 2001-05-10 | 2002-11-14 | Shao-Tsu Kung | Computer system for displaying object images |
US20020171382A1 (en) * | 2000-09-22 | 2002-11-21 | Tanner Christopher Mark | Software-driven motor and solenoid controller |
US20030051631A1 (en) * | 2001-07-31 | 2003-03-20 | Ring Timothy W. | Method of and an apparatus for using a graphical handheld computer for model railroad programming and control |
US20040239268A1 (en) * | 2002-11-27 | 2004-12-02 | Grubba Robert A. | Radio-linked, Bi-directional control system for model electric trains |
US7221113B1 (en) * | 2004-11-10 | 2007-05-22 | The Creative Train Company, Llc | Touch-sensitive model train controls |
2005
- 2005-03-30 US US11/096,841 patent/US20060226298A1/en not_active Abandoned
Cited By (135)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7912595B2 (en) | 1998-06-24 | 2011-03-22 | Katzer Matthew A | Model train control system |
US7904215B2 (en) | 1998-06-24 | 2011-03-08 | Katzer Matthew A | Model train control system |
US20080065284A1 (en) * | 1998-06-24 | 2008-03-13 | Katzer Matthew A | Model train control system |
US20080071435A1 (en) * | 1998-06-24 | 2008-03-20 | Katzer Matthew A | Model train control system |
US7856296B2 (en) | 1998-06-24 | 2010-12-21 | Katzer Matthew A | Model train control system |
US7818102B2 (en) | 1998-06-24 | 2010-10-19 | Katzer Matthew A | Model train control system |
US7970504B2 (en) | 2000-04-03 | 2011-06-28 | Katzer Matthew A | Model train control system |
US7711458B2 (en) | 2000-04-03 | 2010-05-04 | Katzer Matthew A | Model train control system |
US9937431B2 (en) * | 2003-11-26 | 2018-04-10 | Lionel Llc | Model train control system |
US10434429B2 (en) * | 2003-11-26 | 2019-10-08 | Liontech Trains Llc | Model train control system |
US20150306512A1 (en) * | 2003-11-26 | 2015-10-29 | Lionel Llc | Model Train Control System |
US20150306515A1 (en) * | 2003-11-26 | 2015-10-29 | Lionel Llc | Model Train Control System |
US20060100753A1 (en) * | 2004-11-10 | 2006-05-11 | Katzer Matthew A | Model train control |
US20100145557A1 (en) * | 2004-11-10 | 2010-06-10 | Katzer Matthew A | Model train control |
US7885735B2 (en) * | 2004-11-10 | 2011-02-08 | Katzer Matthew A | Model train control |
US8606383B2 (en) | 2005-01-31 | 2013-12-10 | The Invention Science Fund I, Llc | Audio sharing |
US9124729B2 (en) | 2005-01-31 | 2015-09-01 | The Invention Science Fund I, Llc | Shared image device synchronization or designation |
US8902320B2 (en) | 2005-01-31 | 2014-12-02 | The Invention Science Fund I, Llc | Shared image device synchronization or designation |
US9489717B2 (en) | 2005-01-31 | 2016-11-08 | Invention Science Fund I, Llc | Shared image device |
US8988537B2 (en) | 2005-01-31 | 2015-03-24 | The Invention Science Fund I, Llc | Shared image devices |
US9019383B2 (en) | 2005-01-31 | 2015-04-28 | The Invention Science Fund I, Llc | Shared image devices |
US9082456B2 (en) | 2005-01-31 | 2015-07-14 | The Invention Science Fund I Llc | Shared image device designation |
US9910341B2 (en) | 2005-01-31 | 2018-03-06 | The Invention Science Fund I, Llc | Shared image device designation |
US7917345B2 (en) * | 2005-04-07 | 2011-03-29 | New York Air Brake Corporation | Multimedia train simulator |
US20090125287A1 (en) * | 2005-04-07 | 2009-05-14 | New York Air Brake Corporation | Multimedia Train Simulator |
US10003762B2 (en) | 2005-04-26 | 2018-06-19 | Invention Science Fund I, Llc | Shared image devices |
US9819490B2 (en) | 2005-05-04 | 2017-11-14 | Invention Science Fund I, Llc | Regional proximity for shared image device(s) |
US9041826B2 (en) | 2005-06-02 | 2015-05-26 | The Invention Science Fund I, Llc | Capturing selected image objects |
US9191611B2 (en) | 2005-06-02 | 2015-11-17 | Invention Science Fund I, Llc | Conditional alteration of a saved image |
US9967424B2 (en) | 2005-06-02 | 2018-05-08 | Invention Science Fund I, Llc | Data storage usage protocol |
US9451200B2 (en) | 2005-06-02 | 2016-09-20 | Invention Science Fund I, Llc | Storage access technique for captured data |
US9621749B2 (en) | 2005-06-02 | 2017-04-11 | Invention Science Fund I, Llc | Capturing selected image objects |
US9001215B2 (en) | 2005-06-02 | 2015-04-07 | The Invention Science Fund I, Llc | Estimating shared image device operational capabilities or resources |
US8681225B2 (en) | 2005-06-02 | 2014-03-25 | Royce A. Levien | Storage access technique for captured data |
US10097756B2 (en) | 2005-06-02 | 2018-10-09 | Invention Science Fund I, Llc | Enhanced video/still image correlation |
US20070052856A1 (en) * | 2005-06-02 | 2007-03-08 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware. | Composite image selectivity |
US20070078574A1 (en) * | 2005-09-30 | 2007-04-05 | Davenport David M | System and method for providing access to wireless railroad data network |
US20070076750A1 (en) * | 2005-09-30 | 2007-04-05 | Microsoft Corporation | Device driver interface architecture |
US9942511B2 (en) | 2005-10-31 | 2018-04-10 | Invention Science Fund I, Llc | Preservation/degradation of video/audio aspects of a data stream |
US8804033B2 (en) | 2005-10-31 | 2014-08-12 | The Invention Science Fund I, Llc | Preservation/degradation of video/audio aspects of a data stream |
US9167195B2 (en) | 2005-10-31 | 2015-10-20 | Invention Science Fund I, Llc | Preservation/degradation of video/audio aspects of a data stream |
US20090162814A1 (en) * | 2005-12-06 | 2009-06-25 | Andrew Warburton Swan | Video-captured model vehicle simulator |
US8333592B2 (en) * | 2005-12-06 | 2012-12-18 | Andrew Warburton Swan | Video-captured model vehicle simulator |
US9076208B2 (en) | 2006-02-28 | 2015-07-07 | The Invention Science Fund I, Llc | Imagery processing |
US9093121B2 (en) | 2006-02-28 | 2015-07-28 | The Invention Science Fund I, Llc | Data management of an audio data stream |
US8964054B2 (en) | 2006-08-18 | 2015-02-24 | The Invention Science Fund I, Llc | Capturing selected image objects |
US20110109549A1 (en) * | 2007-04-24 | 2011-05-12 | Irobot Corporation | Control System for a Remote Vehicle |
US8760397B2 (en) | 2007-04-24 | 2014-06-24 | Irobot Corporation | Control system for a remote vehicle |
US8350810B2 (en) | 2007-04-24 | 2013-01-08 | Irobot Corporation | Control system for a remote vehicle |
US9195256B2 (en) | 2007-04-24 | 2015-11-24 | Irobot Corporation | Control system for a remote vehicle |
US8199109B2 (en) * | 2007-04-24 | 2012-06-12 | Irobot Corporation | Control system for a remote vehicle |
US20090054768A1 (en) * | 2007-08-24 | 2009-02-26 | Menachem Halmann | Method and apparatus for voice recording with ultrasound imaging |
US9561015B2 (en) * | 2007-08-24 | 2017-02-07 | General Electric Company | Method and apparatus for voice recording with ultrasound imaging |
US20090177608A1 (en) * | 2008-01-08 | 2009-07-09 | Robert Lee Angell | Risk assessment in a gate area of an airport |
US7885909B2 (en) | 2008-01-09 | 2011-02-08 | International Business Machines Corporation | Risk assessment between airports |
US20090177606A1 (en) * | 2008-01-09 | 2009-07-09 | Robert Lee Angell | Risk assessment in a pre/post security area within an airport |
US20090177615A1 (en) * | 2008-01-09 | 2009-07-09 | Robert Lee Angell | Risk assessment between airports |
US20090177605A1 (en) * | 2008-01-09 | 2009-07-09 | Robert Lee Angell | Risk assessment within an aircraft |
US20090177609A1 (en) * | 2008-01-09 | 2009-07-09 | Robert Lee Angell | Risk assessment in an area external to an airport |
US7870085B2 (en) | 2008-01-09 | 2011-01-11 | International Business Machines Corporation | Risk assessment between aircrafts |
US7895143B2 (en) | 2008-01-09 | 2011-02-22 | International Business Machines Corporation | Risk assessment in an area external to an airport |
US7895144B2 (en) | 2008-01-09 | 2011-02-22 | International Business Machines Corporation | Risk assessment in a pre/post security area within an airport |
US9274601B2 (en) * | 2008-04-24 | 2016-03-01 | Blackberry Limited | System and method for generating a feedback signal in response to an input signal provided to an electronic device |
US20090267920A1 (en) * | 2008-04-24 | 2009-10-29 | Research In Motion Limited | System and method for generating a feedback signal in response to an input signal provided to an electronic device |
US20100293462A1 (en) * | 2008-05-13 | 2010-11-18 | Apple Inc. | Pushing a user interface to a remote device |
US9176651B2 (en) | 2008-05-13 | 2015-11-03 | Apple Inc. | Pushing a user interface to a remote device |
US9285968B2 (en) | 2008-05-13 | 2016-03-15 | Apple Inc. | User interface including content from a remote device |
US9335907B2 (en) | 2008-05-13 | 2016-05-10 | Apple Inc. | User interface including content from an accessory |
US20110265003A1 (en) * | 2008-05-13 | 2011-10-27 | Apple Inc. | Pushing a user interface to a remote device |
US20090284476A1 (en) * | 2008-05-13 | 2009-11-19 | Apple Inc. | Pushing a user interface to a remote device |
US20110145863A1 (en) * | 2008-05-13 | 2011-06-16 | Apple Inc. | Pushing a graphical user interface to a remote device with display rules provided by the remote device |
US9870130B2 (en) * | 2008-05-13 | 2018-01-16 | Apple Inc. | Pushing a user interface to a remote device |
US9471207B2 (en) | 2008-05-13 | 2016-10-18 | Apple Inc. | Pushing a user interface to a remote device that controls multiple displays |
US8970647B2 (en) | 2008-05-13 | 2015-03-03 | Apple Inc. | Pushing a graphical user interface to a remote device with display rules provided by the remote device |
US9311115B2 (en) | 2008-05-13 | 2016-04-12 | Apple Inc. | Pushing a graphical user interface to a remote device with display rules provided by the remote device |
US9875006B2 (en) | 2008-05-13 | 2018-01-23 | Apple Inc. | Pushing a graphical user interface to a remote device with display rules provided by the remote device |
US8237583B2 (en) | 2008-11-05 | 2012-08-07 | General Electric Company | Method and system for vital display systems |
US20100109899A1 (en) * | 2008-11-05 | 2010-05-06 | Michael Scott Mitchell | Method and system for vital display systems |
US20110261201A1 (en) * | 2008-11-25 | 2011-10-27 | Toyota Jidosha Kabushiki Kaisha | Input device, vehicle environment monitoring apparatus, icon switch selection method, and recording medium |
US9041804B2 (en) * | 2008-11-25 | 2015-05-26 | Aisin Seiki Kabushiki Kaisha | Input device, vehicle environment monitoring apparatus, icon switch selection method, and recording medium |
US20100135641A1 (en) * | 2008-12-03 | 2010-06-03 | D-Box Technologies Inc. | Method and device for encoding vibro-kinetic data onto an lpcm audio stream over an hdmi link |
WO2010063108A1 (en) * | 2008-12-03 | 2010-06-10 | D-Box Technologies Inc. | Method and device for encoding vibro-kinetic data onto an lpcm audio stream over an hdmi link |
US8515239B2 (en) | 2008-12-03 | 2013-08-20 | D-Box Technologies Inc. | Method and device for encoding vibro-kinetic data onto an LPCM audio stream over an HDMI link |
US10007340B2 (en) | 2009-03-12 | 2018-06-26 | Immersion Corporation | Systems and methods for interfaces featuring surface-based haptic effects |
US10198077B2 (en) * | 2009-03-12 | 2019-02-05 | Immersion Corporation | Systems and methods for a texture engine |
US10747322B2 (en) | 2009-03-12 | 2020-08-18 | Immersion Corporation | Systems and methods for providing features in a friction display |
US10620707B2 (en) | 2009-03-12 | 2020-04-14 | Immersion Corporation | Systems and methods for interfaces featuring surface-based haptic effects |
US10564721B2 (en) | 2009-03-12 | 2020-02-18 | Immersion Corporation | Systems and methods for using multiple actuators to realize textures |
US10466792B2 (en) | 2009-03-12 | 2019-11-05 | Immersion Corporation | Systems and methods for friction displays and additional haptic effects |
US20100231540A1 (en) * | 2009-03-12 | 2010-09-16 | Immersion Corporation | Systems and Methods For A Texture Engine |
US10379618B2 (en) | 2009-03-12 | 2019-08-13 | Immersion Corporation | Systems and methods for using textures in graphical user interface widgets |
US9874935B2 (en) * | 2009-03-12 | 2018-01-23 | Immersion Corporation | Systems and methods for a texture engine |
US10248213B2 (en) | 2009-03-12 | 2019-04-02 | Immersion Corporation | Systems and methods for interfaces featuring surface-based haptic effects |
US20180011539A1 (en) * | 2009-03-12 | 2018-01-11 | Immersion Corporation | Systems and Methods for a Texture Engine |
US9927873B2 (en) | 2009-03-12 | 2018-03-27 | Immersion Corporation | Systems and methods for using textures in graphical user interface widgets |
US9746923B2 (en) | 2009-03-12 | 2017-08-29 | Immersion Corporation | Systems and methods for providing features in a friction display wherein a haptic effect is configured to vary the coefficient of friction |
US10073526B2 (en) | 2009-03-12 | 2018-09-11 | Immersion Corporation | Systems and methods for friction displays and additional haptic effects |
US10073527B2 (en) | 2009-03-12 | 2018-09-11 | Immersion Corporation | Systems and methods for providing features in a friction display including a haptic effect based on a color and a degree of shading |
US9696803B2 (en) | 2009-03-12 | 2017-07-04 | Immersion Corporation | Systems and methods for friction displays and additional haptic effects |
CN102034341B (en) * | 2009-09-30 | 2012-12-12 | 金宝电子工业股份有限公司 | Control system and method for generating control picture by using control system |
FR2957266A1 (en) * | 2010-03-11 | 2011-09-16 | Parrot | METHOD AND APPARATUS FOR REMOTE CONTROL OF A DRONE, IN PARTICULAR A ROTARY-WING DRONE. |
FR2957265A1 (en) * | 2010-03-11 | 2011-09-16 | Parrot | Method for implementing remote control apparatus e.g. Iphone type cellular telephone used to remotely control helicopter, involves activating control command based on analysis results |
US20110221692A1 (en) * | 2010-03-11 | 2011-09-15 | Parrot | Method and an appliance for remotely controlling a drone, in particular a rotary wing drone |
CN102266672A (en) * | 2010-03-11 | 2011-12-07 | 鹦鹉股份有限公司 | Method and device for remote control of a drone, in particular a rotary-wing drone |
EP2364757A1 (en) * | 2010-03-11 | 2011-09-14 | Parrot | Method and device for remote control of a drone, in particular a rotary-wing drone |
US8958928B2 (en) * | 2010-03-11 | 2015-02-17 | Parrot | Method and an appliance for remotely controlling a drone, in particular a rotary wing drone |
US20110296340A1 (en) * | 2010-05-31 | 2011-12-01 | Denso Corporation | In-vehicle input system |
US9555707B2 (en) * | 2010-05-31 | 2017-01-31 | Denso Corporation | In-vehicle input system |
US9330507B2 (en) | 2010-08-18 | 2016-05-03 | Snap-On Incorporated | System and method for selecting individual parameters to transition from text-to-graph or graph-to-text |
US8560168B2 (en) | 2010-08-18 | 2013-10-15 | Snap-On Incorporated | System and method for extending communication range and reducing power consumption of vehicle diagnostic equipment |
US9117321B2 (en) | 2010-08-18 | 2015-08-25 | Snap-On Incorporated | Method and apparatus to use remote and local control modes to acquire and visually present data |
US8983785B2 (en) | 2010-08-18 | 2015-03-17 | Snap-On Incorporated | System and method for simultaneous display of waveforms generated from input signals received at a data acquisition device |
US8463953B2 (en) | 2010-08-18 | 2013-06-11 | Snap-On Incorporated | System and method for integrating devices for servicing a device-under-service |
US8754779B2 (en) | 2010-08-18 | 2014-06-17 | Snap-On Incorporated | System and method for displaying input data on a remote display device |
US8935440B2 (en) | 2010-08-18 | 2015-01-13 | Snap-On Incorporated | System and method for integrating devices for servicing a device-under-service |
US9304062B2 (en) | 2010-08-18 | 2016-04-05 | Snap-On Incorporated | System and method for extending communication range and reducing power consumption of vehicle diagnostic equipment |
US9633492B2 (en) | 2010-08-18 | 2017-04-25 | Snap-On Incorporated | System and method for a vehicle scanner to automatically execute a test suite from a storage card |
US20190259080A1 (en) * | 2011-10-17 | 2019-08-22 | Johnson Controls Technology Company | Battery selection and feedback system and method |
US20160194041A1 (en) * | 2011-12-16 | 2016-07-07 | Entro Industries, Inc. | Control System for Load Transportation Device |
US10787212B2 (en) * | 2011-12-16 | 2020-09-29 | Entro Industries, Inc. | Control system for load transportation device |
USD786887S1 (en) * | 2013-04-19 | 2017-05-16 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
USD815658S1 (en) | 2013-04-19 | 2018-04-17 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
US20140360399A1 (en) * | 2013-06-11 | 2014-12-11 | BlueRail Trains LLC | Wireless model railroad control system |
US9789612B2 (en) | 2014-01-07 | 2017-10-17 | Irobot Defense Holdings, Inc. | Remotely operating a mobile robot |
US9283674B2 (en) | 2014-01-07 | 2016-03-15 | Irobot Corporation | Remotely operating a mobile robot |
US9592604B2 (en) | 2014-01-07 | 2017-03-14 | Irobot Defense Holdings, Inc. | Remotely operating a mobile robot |
US11262899B2 (en) * | 2014-09-10 | 2022-03-01 | Lego A/S | Method for establishing a functional relationship between input and output functions |
US10249088B2 (en) * | 2014-11-20 | 2019-04-02 | Honda Motor Co., Ltd. | System and method for remote virtual reality control of movable vehicle partitions |
US20160147233A1 (en) * | 2014-11-20 | 2016-05-26 | Honda Motor Co., Ltd. | System and method for remote virtual reality control of movable vehicle partitions |
US9994243B1 (en) * | 2017-07-05 | 2018-06-12 | Siemens Industry, Inc. | Clear enclosure top dome for end of train device |
US10652719B2 (en) | 2017-10-26 | 2020-05-12 | Mattel, Inc. | Toy vehicle accessory and related system |
US11471783B2 (en) | 2019-04-16 | 2022-10-18 | Mattel, Inc. | Toy vehicle track system |
US11964215B2 (en) | 2019-04-16 | 2024-04-23 | Mattel, Inc. | Toy vehicle track system |
CN112269385A (en) * | 2020-10-23 | 2021-01-26 | 北京理工大学 | Cloud unmanned vehicle dynamics control system and method |
US20230280890A1 (en) * | 2022-03-04 | 2023-09-07 | Ford Global Technologies, Llc | Methods and systems for vehicle interface control |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20060226298A1 (en) | Graphical method and system for model vehicle and accessory control | |
CN101482790B (en) | Electronic device capable of transferring object between two display elements and its control method | |
AU2009246654B2 (en) | Pushing a user interface to a remote device | |
CN101645197A (en) | Touchable multifunctional remote controller and touchable multifunctional remote control method | |
JP5652432B2 (en) | Vehicle control device | |
US20110128446A1 (en) | Car audio/video terminal system having user interface linkable with portable device and method for linking the same | |
US20140173155A1 (en) | Mobile device dock | |
CN104396282A (en) | Configuration interface for a programmable multimedia controller | |
CN103703495A (en) | Remote control device, information processing method and system | |
CN103024566A (en) | Television interface operation control apparatus and remote control device and method thereof | |
US20130181915A1 (en) | Touch display, computer system having a touch display, and method of switching modes of a touch display | |
KR100679634B1 (en) | Educational system for drawing up icon-based robot control program and its method | |
JPWO2014112080A1 (en) | Operating device | |
US20130201126A1 (en) | Input device | |
KR20170009302A (en) | Display apparatus and control method thereof | |
JP2010182134A (en) | Remote control system, control terminal device, and selection status display alternation method | |
KR20150045439A (en) | A method and device for controlling a display device | |
US20030051631A1 (en) | Method of and an apparatus for using a graphical handheld computer for model railroad programming and control | |
KR20100086570A (en) | Integrated remote control device, integrated remote control method and storage medium of storing program for executing the same | |
EP1976326B1 (en) | Apparatus using remote control signal, car navigation apparatus, and display | |
KR101682527B1 (en) | touch keypad combined mouse using thin type haptic module | |
CN100355223C (en) | Remote control apparatus and control method thereof | |
CN112799555A (en) | Screen cleaning control method, touch display device and electronic device | |
TW201427401A (en) | Television, remote controller and menu displaying method | |
JP5027084B2 (en) | Input device and input method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: LIONEL L.L.C., MICHIGAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PIERSON, MARTIN;REEL/FRAME:016741/0396 Effective date: 20050706 |
|
AS | Assignment |
Owner name: WACHOVIA BANK, NATIONAL ASSOCIATION, NEW YORK Free format text: AMENDED AND RESTATED PATENT COLLATERAL ASSIGNMENT AND SECURITY AGREEMENT;ASSIGNOR:LIONEL L.L.C.;REEL/FRAME:020909/0942 Effective date: 20080501 |
|
AS | Assignment |
Owner name: GUGGENHEIM CORPORATE FUNDING, LLC, NEW YORK Free format text: SHORT FORM PATENT SECURITY AGREEMENT;ASSIGNOR:LIONEL L.L.C.;REEL/FRAME:020951/0794 Effective date: 20080501 |
|
AS | Assignment |
Owner name: GUGGENHEIM CORPORATE FUNDING, LLC, NEW YORK Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE SIGNATURE PAGES TO THE SHORT FORM PATENT SECURITY AGREEMENT PREVIOUSLY RECORDED ON REEL 020951 FRAME 0794. ASSIGNOR(S) HEREBY CONFIRMS THE SHORT FORM PATENT SECURITY AGREEMENT;ASSIGNOR:LIONEL L.L.C.;REEL/FRAME:021029/0775 Effective date: 20080501 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: LIONEL L.L.C., NORTH CAROLINA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:GUGGENHEIM CREDIT SERVICES, LLC (SUCCESSOR IN INTEREST TO GUGGENHEIM CORPORATE FUNDING, LLC), AS AGENT;REEL/FRAME:054246/0651 Effective date: 20201026 |