WO2015015385A1 - System and method for guiding the placement of a sensor - Google Patents
System and method for guiding the placement of a sensor
- Publication number: WO2015015385A1
- Application number: PCT/IB2014/063402
- Authority: WIPO (PCT)
Classifications
- A61B5/684 — Indicating the position of the sensor on the body
- A61B5/1114 — Tracking parts of the body
- A61B5/1127 — Measuring movement of the entire body or parts thereof using markers
- A61B5/389 — Electromyography [EMG]
- A61B5/6824 — Specially adapted to be attached to the arm or wrist
- A61B2034/2055 — Optical tracking systems
- A61B2090/3937 — Visible markers
- A61B2090/395 — Visible markers with marking agent for marking skin or other tissue
Abstract
The present invention provides a system and a method for guiding the placement of a sensor. A system for guiding the placement of the sensor comprises an infrared camera configured to acquire an image of a subject; a first unit (11) for identifying at least one marker in the acquired image, wherein each of the at least one marker (A, B) is a retro-reflective marker attached to an anatomical location of the subject; and a second unit (12) for generating a control signal according to the location(s) of the identified at least one marker in the image and predefined criteria, wherein the predefined criteria define the spatial relationship between the anatomical location(s) of the subject and a first location of the subject where the sensor is to be placed. The second unit is configured to control a third unit (13) to provide, based on the control signal, an indication signal indicative of the first location where the sensor is to be placed. According to the system, repeatable, fast and convenient placement of the sensor is achieved.
Description
System and method for guiding the placement of a sensor
FIELD OF THE INVENTION
The present invention generally relates to the placement of a sensor, especially to the placement of a medical sensor on a region of interest of a subject.
BACKGROUND OF THE INVENTION
In the field of medicine, various sensors may be placed on a subject to detect physiological parameters of the subject. The sensors include, but are not limited to, electrocardiography (ECG) sensors, electromyography (EMG) sensors, electroencephalogram (EEG) sensors and ultrasonic sensors. A major factor affecting the quality of the detected physiological parameters for such sensors is sensor placement on the subject.
Generally, in order to obtain a high-quality physiological signal from a region of interest of the subject, it is important to place a sensor in a desired location on the region of interest so that a physiological signal of optimal quality may be obtained from said desired location. For example in the case of an EMG sensor, different methods have been proposed to detect a desired location for the EMG sensor.
A straightforward way to detect the desired location of the EMG sensor is manual measurement using a ruler with reference to certain anatomical landmarks in the region of interest of the subject according to well-accepted guidelines for sensor placement such as the SENIAM (surface EMG for non-invasive assessment of muscles) recommendations. As the method is carried out manually, it is not only time-consuming but also difficult to perform. Moreover, the detected sensor locations tend to have considerable variability across sessions or between practitioners due to measurement error, and therefore repeatable placement of an EMG sensor across different sessions or by different practitioners may be difficult to achieve.
Another method detects the desired location of the EMG sensor on the region of interest of the subject in dependence on the motion of the subject. In particular, the
method detects as the desired location of the EMG sensor a location on the region of interest of a subject where the obtained EMG signal has the largest signal-to-noise ratio during motion of the subject. As the method has to be performed during motion of the subject and requires multiple measurements for a plurality of locations on the region of interest of the subject, it is time-consuming and cannot be used for applications where the subject is immobile or not supposed to perform active movement.
US 2009/0118610 A1 proposes an MRI localization and/or guidance system for facilitating placement of an interventional therapy and/or device in vivo. However, an MRI system is an expensive healthcare device; while MRI guidance for the placement of an interventional device may be feasible in a complicated surgery, it is not practical to request MRI guidance for the placement of a sensor in a routine quick physical exam.
SUMMARY OF THE INVENTION
Therefore, it would be desirable to provide a system and a method for achieving repeatable, fast and convenient placement of a sensor on a region of interest of a subject, which overcomes the above problems.
According to the present invention, the desired location where the sensor is to be placed on a subject is automatically determined relative to the anatomical location(s) of the subject according to predefined criteria which define the spatial relationship between the desired location of the subject and the anatomical location(s) of the subject. Said automatic determination, instead of manual measurement, for the placement of the sensor is fast and convenient. Moreover, it reduces measurement errors and thus enables repeatable placement of the sensor on the same location of the subject across sessions or by different practitioners.
In addition, according to the present invention, the desired location where the sensor is to be placed may be determined while the subject keeps still. Therefore, it is applicable even when the examinee is immobile or not supposed to perform active movement.
In one aspect, the present invention provides a system for guiding the placement of a sensor. The system comprises: an image acquisition unit, such as an infrared camera, configured to acquire an image of the subject; and a first unit configured to identify at least one marker in the acquired image, wherein each of the at least one marker is a retro-reflective marker attached to an anatomical location of the subject. The system further comprises a second unit configured to generate a control signal according to the location(s) of the identified at least one retro-reflective marker in the image and predefined criteria, wherein the predefined criteria define the spatial relationship between the anatomical location(s) of the subject and a first location of the subject where the sensor is to be placed. The system further comprises a third unit configured to provide an indication signal indicative of the first location of the subject, wherein the second unit is further configured to control, based on the control signal, the third unit to provide the indication signal indicative of the first location of the subject for guiding the placement of the sensor.
In another aspect, the present invention provides a method of guiding the placement of a sensor. According to the method, an image of a subject is acquired by an infrared camera, and at least one marker is identified in the image, wherein each of the at least one marker (A, B) is a retro-reflective marker attached to an anatomical location of the subject; a control signal is generated according to the location(s) of the identified at least one retro-reflective marker in the image and predefined criteria, wherein the predefined criteria define the spatial relationship between the anatomical location(s) of the subject and a first location of the subject where the sensor is to be placed; and an indication signal indicative of the first location of the subject is provided based on the control signal.
In a further aspect, the present invention provides a computer program comprising computer program code means for causing a computer to perform the steps of the method as described above according to the present invention.
Various aspects and features of the disclosure are described in further detail below. Other objects and advantages of the present invention will become more apparent and will be easily understood with reference to the description made in combination with the accompanying drawings.
DESCRIPTION OF THE DRAWINGS
The present invention will be described and explained hereinafter in more detail in combination with embodiments and with reference to the drawings, wherein:
Fig. 1 is a schematic block diagram of a system for guiding the placement of an EMG sensor on a subject according to one embodiment of the present invention;
Fig. 2 shows a configuration of one or more markers on the subject, i.e., a patient;
Fig. 3 is a schematic block diagram of a system for guiding the placement of an EMG sensor on a subject according to another embodiment of the present invention;
Fig. 4 is a schematic block diagram of a system for guiding the placement of an EMG sensor on a subject according to a further embodiment of the present invention;
Fig. 5 is a flowchart for guiding the placement of an EMG sensor on a subject according to the method of the invention;
The same reference signs in the figures indicate similar or corresponding features and/or functionalities.
DETAILED DESCRIPTION
The present invention will be described with respect to particular embodiments and with reference to certain drawings, but the invention is not limited thereto but only by the claims. The drawings described are only schematic and are non-limiting. In the drawings, the size of some of the elements may be exaggerated and not drawn to scale for illustrative purposes.
Embodiments of the invention will be described with reference to an EMG sensor in the following part of the document. However, for those skilled in the art, it may be easy to apply the embodiments to other types of sensors which need to be located on a certain position of a subject, including but not limited to ECG sensors, EEG sensors, ultrasonic sensors, etc.
Fig. 1 shows, according to one embodiment of the present invention, a schematic
block diagram of a system 10 for guiding the placement of an EMG sensor on a subject, e.g., an upper arm of a patient, to obtain EMG signals from a muscle of interest.
Fig.2 shows a configuration of the marker(s) on the subject, i.e., an upper arm of a patient. In this case, a desired location to place an EMG sensor for the assessment of the biceps brachii of a patient is indicated according to the present invention.
System 10 comprises an image acquisition unit, such as an infrared camera, for acquiring images of the subject. The image acquisition unit could also be another type of camera, such as a digital camera. The system 10 further comprises a first unit 11 which receives images of the subject and identifies at least one marker (i.e. marker A and marker B as shown in Fig.2) in the acquired image. The at least one marker is attached to an anatomical location of the subject. For example, in Fig.2, marker A and marker B have been attached on the medial acromion and the cubital fossa, respectively. The first unit 11 determines one or more locations of the at least one marker in the image; for example, the first unit 11 determines the coordinates of marker A and marker B in the image.
Advantageously, the at least one marker is an entity marker (i.e. a physical marker) which may be attached directly to one or more anatomical locations of the subject or to one or more corresponding parts of a training garment to be worn by the subject before imaging. The marker may also be part of a garment. In the case of attaching the marker to a garment or the marker being part of the garment, the marker and the garment are designed such that, when the subject wears the garment, each of the at least one marker covers a desired anatomical location. When the image of the subject is captured, the at least one marker may be displayed and identified in the image of the subject.
Advantageously, the at least one marker is of the retro-reflective type, reflecting light of a specific wavelength, and the image acquisition unit may receive the reflected light of the specific wavelength and obtain an image of the markers which reflect that light. For example, the one or more retro-reflective markers reflect infrared light only and the image acquisition unit is an infrared camera that obtains an infrared image. By using retro-reflective markers and an infrared camera, the image will show only the markers, because only the markers reflect the infrared light; other parts of the body or garment are not captured in the image. In this way, the computation process according to the invention may be substantially simplified, as only the retro-reflective markers are displayed in the image.
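As an illustrative sketch of the marker-identification step performed by the first unit 11, the bright retro-reflective blobs in such an infrared image can be segmented by intensity thresholding and reduced to their centroids. The function name, threshold value and 4-connectivity flood fill below are assumptions for illustration, not part of the invention as claimed:

```python
import numpy as np

def find_marker_centroids(ir_image, threshold=200):
    """Locate retro-reflective markers in an infrared image.

    Pixels at or above `threshold` are grouped into connected
    components; the centroid of each component is returned as a
    marker location (row, col) in image coordinates.
    """
    mask = ir_image >= threshold
    visited = np.zeros_like(mask, dtype=bool)
    centroids = []
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not visited[r, c]:
                # Flood-fill one bright blob (4-connectivity).
                stack = [(r, c)]
                visited[r, c] = True
                pixels = []
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                ys, xs = zip(*pixels)
                centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centroids
```

Because the retro-reflective markers are, as described above, essentially the only bright content in the infrared image, a fixed threshold of this kind is typically sufficient; a real implementation might instead use a library routine for connected-component labeling.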
The anatomical locations to be selected for different muscles of the subject to be detected by EMG may be different from one another. In particular, the anatomical location(s) to be indicated by the at least one marker may be selected for different muscles from certain guidelines for desired placement of the EMG sensor. The guidelines comprise the currently well-accepted SENIAM recommendations (surface electromyography for non-invasive assessment of muscles). For example, for the biceps brachii, the anatomical location(s) to be indicated by the at least one marker may be selected to be the medial acromion and the cubital fossa. Therefore, the markers may be positioned on the medial acromion and the cubital fossa, respectively.
Also, the number of anatomical locations to be indicated by the markers differs for different muscles to be detected by EMG and is not limited to the two locations described above. For example, for the vastus medialis oblique, only one anatomical location may be selected, namely the superior rim of the patella; and for the scapula, three anatomical locations are to be selected: one on the inferior tip of the scapula, one on the spine and another along the lateral edge of the torso.
Returning to Fig.1, the system 10 also comprises a second unit 12. The second unit 12 receives, from the first unit 11, the determined one or more locations, e.g., the coordinates, of the identified at least one marker in the image and generates a control signal according to the received location(s) of the identified at least one marker in the image and predefined criteria. The predefined criteria define the spatial relationship between the anatomical location(s) of the subject and a desired location of the subject (hereinafter referred to as "first location") where the sensor is to be placed.
The system further comprises a third unit 13 which is capable of providing an indication signal. The indication signal indicates the desired first location where the EMG sensor is to be placed on the subject. The second unit 12 provides the generated control signal to control the third unit 13 to provide the indication signal based on the control signal. According to the indication signal provided by the third unit 13, a user may conveniently and repeatably position the EMG sensor on the desired location of the subject.
The predefined criteria may also be selected from the guidelines for optimal placement of the EMG sensor, as described above, and be pre-stored in a memory comprised in the system or in a remote memory. Generally, the predefined criteria define the relationship between the first location of the subject and the anatomical location(s) of a subject indicated by the at least one marker attached on the anatomical location(s). For example, according to the predefined criteria for biceps brachii, the EMG sensor is to be placed on the line between the medial acromion (marker A) and the cubital fossa (marker B) at 1/3 of the distance from the cubital fossa to the medial acromion, as marked with a cross in Fig.2.
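The biceps brachii criterion just described can be sketched numerically: given the image coordinates of markers A and B, the first location lies on the segment between them, at 1/3 of the distance from B (cubital fossa) toward A (medial acromion). The helper name and coordinate-tuple convention below are illustrative assumptions:

```python
def sensor_location(marker_a, marker_b, fraction=1.0 / 3.0):
    """Interpolate the sensor position on the line between two markers.

    marker_a: medial acromion (A); marker_b: cubital fossa (B).
    The SENIAM-style criterion for the biceps brachii places the
    electrode at `fraction` of the distance from B toward A.
    """
    ax, ay = marker_a
    bx, by = marker_b
    return (bx + fraction * (ax - bx), by + fraction * (ay - by))
```

Other predefined criteria would supply a different `fraction`, or, as noted above, angles or azimuths relative to the marker(s) instead of a simple linear interpolation.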
Fig.3 is a schematic block diagram of a system 10A for guiding the placement of an EMG sensor on a subject according to another embodiment of the present invention. Optionally, the second unit 12 in Fig.3 comprises a fourth unit 14 which receives the determined one or more locations of the at least one marker in the image of the subject from the first unit 11 as shown in Fig.1; the fourth unit 14 then determines one or more location parameters in accordance with the location of the at least one marker in the image and the criteria defining the spatial relationship between the first location and the at least one marker. The one or more location parameters determine a location in the image. The determined location satisfies the relation defined by the predefined criteria and therefore reflects the first location of the subject. The location parameters are not limited to the distance between the marker(s) and the first location; angles, orientations, azimuth relative to the marker(s) and so on may also be used.
Particularly, in the fourth unit 14, the one or more location parameters in the image may be further converted to location parameters which represent the first location of the subject, and then the second unit 12 generates a control signal to control the third unit 13 to provide the indication signal according to the actual first location of the subject. To generate the control signal, the second unit 12 may also need information such as the distance between the user and the camera, the physical distance between the at least two markers, etc. This information may be used to map the determined location parameters of the image to the first location on the body of the subject. As is well known in the art, the conversion of the location parameter(s) in the image into the actual location parameters on the subject may be based on coordinate transformation between different reference systems according to the principle of perspective projection.
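Under a simplifying fronto-parallel assumption (the full perspective-projection conversion would require camera calibration), the known physical distance between two markers gives a scale factor that maps image offsets to physical offsets on the subject's body. This is a hedged sketch only; the function names and the centimeter units are assumptions:

```python
import math

def pixel_to_body_scale(marker_px_a, marker_px_b, physical_distance_cm):
    """Estimate cm-per-pixel from two markers a known physical distance apart."""
    d_px = math.dist(marker_px_a, marker_px_b)
    return physical_distance_cm / d_px

def image_offset_to_cm(offset_px, scale):
    """Convert an image-plane offset (pixels) to a physical offset (cm)."""
    return tuple(v * scale for v in offset_px)
```

For example, if the two markers are 100 px apart in the image and 25 cm apart on the body, the scale is 0.25 cm/px, and a 40 px offset in the image corresponds to 10 cm on the subject.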
Fig.4 shows a system 10B according to a further embodiment of the present invention. According to the embodiment, the third unit 13 comprises a fifth unit 15.
The second unit 12 receives the determined one or more locations of the identified at least one marker in the image from the first unit 11 and generates a control signal based on the received locations and the predefined criteria. The predefined criteria, as described above, may be received from a local or remote memory in which the predefined criteria are pre-stored.
The control signal generated by the second unit 12 is transmitted to the fifth unit 15. The fifth unit 15 receives the control signal from the second unit 12. The fifth unit 15, being controlled by the control signal, enables the third unit 13 to provide the indication signal indicative of the first location of the subject to the subject.
In one embodiment, the fifth unit 15 may be implemented as an actuating device, such as a servomotor. In addition to the fifth unit 15, the third unit 13 may also comprise a light source 16, such as a laser pointer. The servomotor is connected to the laser pointer such that the laser pointer can be actuated to project a visible light (the indication signal) to the first location of the subject. In this example, the indication signal provided by the third unit 13 is a visible light signal. In response to receiving the control signal from the second unit 12, the servomotor may be controlled to adjust the pitch and yaw of the laser pointer to provide the visible light to the first location of the subject. Certainly, the light source 16 could also be of any other type, such as an LED, which may also provide the visible light signal to the first location of the subject.
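The pitch and yaw commands for such a servomotor-driven laser pointer can be sketched as follows, assuming the target's 3D position relative to the laser pivot is known (x right, y up, z forward). The coordinate convention and function name are illustrative assumptions, not details from the application:

```python
import math

def aim_angles(target, origin=(0.0, 0.0, 0.0)):
    """Yaw and pitch (degrees) to point a pan-tilt laser at `target`.

    Coordinate convention: x right, y up, z forward from the pivot.
    """
    dx = target[0] - origin[0]
    dy = target[1] - origin[1]
    dz = target[2] - origin[2]
    yaw = math.degrees(math.atan2(dx, dz))                     # pan about the vertical axis
    pitch = math.degrees(math.atan2(dy, math.hypot(dx, dz)))   # elevation above the horizontal
    return yaw, pitch
```

A target 1 m ahead and 1 m to the right, at pivot height, would thus require a 45° yaw and 0° pitch.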
Although the fifth unit 15 and the light source 16 are described with respect to the system 10B, it is clear that the fifth unit 15 and the light source 16 may also be used for the system 10 or 10A as shown in Figs. 1 or 3.
According to the embodiment, the location where the EMG sensor is to be placed may be automatically indicated to the subject using a very inexpensive infrared camera and light source. Therefore, a user may immediately and conveniently locate the EMG sensor on the indicated location at low cost.
In another embodiment, different from that shown in Fig. 4, an object may be brought close to the subject and moved from an original location (hereinafter referred to as "second location") to the first location. The object may be comprised in the system of the invention or disposed independently of the system.
Generally, the object may be moved by a user who aids the subject to place the sensor, such as a nurse, or a physician. The object can be imaged while the subject is being imaged. Particularly, the object may be a marking device, preferably a marking pen. Optionally, the object may be covered with a marker at one end. Preferably, the object may be covered with retro-reflective material at one end, similar to the above described retro-reflective marker, and in this case the object may be shown in the image and the other part of the subject may not be shown in the image.
During movement of the object, the first unit 11 of system 10 and 10A identifies not only the at least one marker of the region of interest in the image but also the moving object. The second unit 12 of the system 10 and 10A guides the user to move the object from the second location to the first location. The first location of the subject may be computed from the location(s) of the identified at least one marker in the image, based on the predefined criteria, by using perspective projection.
In one aspect, if it is determined that the current location of the identified object does not match the first location, the second unit 12 may provide a control signal to control the third unit 13 to generate an indication signal, which indication signal may be an audio signal. In this case, the third unit 13 may comprise a speaker. The audio
signal reflects the deviation between the first location and the current location of the identified object; such an audio signal may be used by the user for guiding the user in moving the object from the current location to the first location. The user further moves the object according to the guidance from the audio signal and marks the location as the first location when the audio signal indicates that the current location of the object matches the first location.
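One plausible mapping from the computed deviation to such an audio signal is a repetition-rate scheme in which beeps come faster as the object approaches the first location, together with a tolerance test for declaring a match. The interval range, maximum deviation and tolerance below are arbitrary illustrative values, not figures from the application:

```python
def beep_interval_ms(deviation_cm, min_interval=50, max_interval=1000,
                     max_deviation_cm=20.0):
    """Map the object-to-target deviation to a beep repetition interval.

    Small deviation -> short interval (rapid beeps); the deviation is
    clamped to [0, max_deviation_cm] before the linear mapping.
    """
    d = min(max(deviation_cm, 0.0), max_deviation_cm)
    frac = d / max_deviation_cm
    return round(min_interval + frac * (max_interval - min_interval))

def location_matches(deviation_cm, tolerance_cm=0.5):
    """True when the object is close enough to mark the first location."""
    return deviation_cm <= tolerance_cm
```

Pitch, loudness or spoken directions could equally encode the deviation; the essential point is that the signal varies monotonically with the remaining distance to the first location.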
In another aspect, if it is determined that the current location of the identified object on the region of interest matches the first location on the region of interest, the user may mark the location directly in response to a corresponding audio signal. In the case where the object is a marking pen, marking may be done by means of the object directly. Otherwise, marking may be done by means of a marking device, such as a pen.
In addition, the object may comprise adhesive material at one end, and when the indication signal indicates that the object has been moved to the first location of the subject, the object may be attached to the first location of the subject by sticking it to the first location directly. Also, the object may be the EMG sensor itself, and when the audio signal (indication signal) indicates that the EMG sensor has been moved to the first location, the user attaches the EMG sensor to the first location.
Although, as described above, the embodiment is described with respect to the third unit 13 that generates the indication signal such as the audio signal, it may be understood that the third unit 13 may also be a display and the indication signal may be a real-time image of the subject displayed on the third unit 13; in the image, the location of the identified object and the desired location, i.e., the first location, where the EMG sensor is to be placed are shown. The user may move the object according to the displayed image until the location of the identified object coincides with the first location.
According to the embodiment, by moving an object with the guidance of the audio signal, a more cost-efficient means of providing the indication signal may be achieved.
Although the system of the invention is described with respect to a first unit, a second unit, a third unit, a fourth unit and a fifth unit, respectively, it should be understood that the system of the invention is not limited to the configurations described above. One or more of the above units may be omitted or integrated into one unit performing the same function. Furthermore, it should be understood that the individual units of the system of the invention may be implemented by any one of software, hardware, firmware or a combination thereof. In particular, the units of the invention may be implemented not only by means of computer programs performing the corresponding functions but also by means of various physical devices, such as application-specific integrated circuits (ASIC), digital signal processors (DSP), programmable logic devices (PLD), field-programmable gate arrays (FPGA), CPUs and controllers.
Although, as shown in the drawings, the system of the invention is illustrated such that the third unit 13 and the object are not included therein, it may be clear that at least one of them may be also comprised in the system of the invention.
It may also be clear that the system of the invention may be commercially available as an individual device. However, the system of the invention may also be implemented as a rehabilitation system. In this case, the system further comprises a training garment configured to be worn by a subject, e.g., a patient, and an image acquisition unit configured to acquire an image of the subject.
In the system, one or more markers may be attached to the training garment at one or more locations corresponding to one or more anatomical locations of the subject. The image acquisition unit, e.g., a video camera, acquires an image of the subject, e.g., the patient, when the subject wears the training garment and the one or more markers are located at the corresponding anatomical locations of the subject. The first unit 11 of the system receives the image from the image acquisition unit, and the second unit 12 of the system controls the third unit to provide an indication signal indicating the first location of the subject. As described above, the markers are preferably retro-reflective markers, optionally reflecting infrared light only; in this case the image acquisition unit may be an infrared camera that acquires an infrared image in which only the markers are visible.
Fig. 5 shows a flowchart for guiding the placement of a sensor according to the method of the invention. In step S1, an image of a region of interest of a subject, e.g., an upper arm of a patient, is acquired by an infrared camera. Before the image acquisition, the subject may be asked to assume a proper position and keep still, such that the region of interest of the subject can be imaged. Guidance on positioning the subject so that a specific region of interest can be imaged may also be found in the guidelines described above. Since, according to the method, the placement of the EMG sensor may be guided while the subject keeps still, the method may also be used in applications where the subject is either not able or not supposed to move. Furthermore, the one or more markers may be attached to the corresponding anatomical locations of the region of interest of the subject prior to the image acquisition process.
In step S2, at least one marker is identified in the image of the subject. During step S2, the coordinates of the location of the identified one or more markers in the image may be determined. Preferably, an object to be moved from an initial location to the first location of the subject may also be identified in the image.
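The marker identification of step S2 can be illustrated with a short sketch. The following Python fragment is purely illustrative and not part of the invention: the function name, the threshold value, and the plain nested-list image format are all assumptions. It thresholds a grayscale infrared image, in which the retro-reflective markers appear as bright blobs, and returns the image coordinates of each blob's centroid:

```python
def find_markers(image, threshold=200):
    """Return the (row, col) centroid of each bright blob in a 2-D
    grayscale image, given as a nested list of pixel intensities.
    The threshold of 200 is an illustrative default, not a value
    taken from the patent."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    markers = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                # Flood-fill the connected bright region (4-connectivity).
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # Centroid of the blob = mean pixel position.
                markers.append((sum(p[0] for p in pixels) / len(pixels),
                                sum(p[1] for p in pixels) / len(pixels)))
    return markers
```

A practical implementation would instead rely on a calibrated camera and a robust blob detector, but the principle — threshold, group connected bright pixels, take centroids — is the same.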
Preferably, in step S3, a control signal is generated based on the location(s) of the identified at least one marker in the image and the predefined criteria, and is transmitted to a fifth unit, i.e., an actuating device, which is controlled by the control signal to enable the third unit, i.e., an indication unit, to provide the indication signal indicating the first location of the subject.
The first location is computed from the location(s) of the identified at least one marker in the image, based on the predefined criteria, by using perspective projection. As described above, the predefined criteria define the relationship between the first location of the subject and the anatomical location(s) indicated by the at least one marker; the anatomical location(s) may be selected according to certain guidelines for optimal placement of the EMG sensor.
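As a concrete illustration of such a predefined criterion: surface-EMG placement guidelines (e.g., the SENIAM recommendations of Hermens et al., cited below) typically define an electrode site as a fixed fraction of the distance along the line between two anatomical landmarks. The following sketch is a hypothetical example of that interpolation in image coordinates; the function name and the one-third fraction are assumptions, not values from the patent:

```python
def first_location(marker_a, marker_b, fraction=1.0 / 3.0):
    """Interpolate the sensor site on the line from marker_a toward
    marker_b. `fraction` encodes a predefined criterion such as
    'one third of the way from landmark A to landmark B'; the default
    here is illustrative only."""
    return tuple(a + fraction * (b - a) for a, b in zip(marker_a, marker_b))
```

The same idea generalizes: each entry of the predefined criteria pairs a target muscle with the landmarks and the fraction (or offset) that locate its sensor site.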
Alternatively, in step S3, the indication signal may be an audio signal which, for example, reflects the deviation between the first location of the subject and the location of the object, and which may be used for guiding the object as it is moved from the second location to the first location of the subject. The user moves the object according to the indication signal. The method may further comprise a step of marking the first location with a marking device when the indication signal indicates that the object has been moved to the first location.
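One simple way to encode such a deviation in an audio signal is to map the remaining distance to a tone frequency, so the pitch rises as the object approaches the first location. The sketch below shows only that mapping; the function name, the distance scale, and the frequency range are illustrative assumptions, and actual sound generation on audio hardware is outside its scope:

```python
import math

def guidance_tone(target, current, max_dist=100.0,
                  f_min=200.0, f_max=1000.0):
    """Map the distance between the target (first location) and the
    object's current location to a tone frequency in Hz. Closer to
    the target means a higher pitch; all parameter values here are
    illustrative defaults, not values from the patent."""
    dist = math.dist(target, current)
    # Clamp to [0, 1]: 1.0 means "at the target", 0.0 means "far away".
    closeness = 1.0 - min(dist, max_dist) / max_dist
    return f_min + closeness * (f_max - f_min)
```

Calling this on every video frame yields a continuously updated pitch, which is the kind of real-time feedback the audio-guidance embodiment describes.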
In step S4, an indication signal is provided by the third unit, based on the generated control signal, to indicate the desired location, i.e., the first location, at which the sensor is to be placed.
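For the light-projection variant of the indication signal (an actuating device adjusting the pitch and yaw of a light source, as in claims 2 to 4), the control signal must translate the computed first location into actuator angles. A minimal sketch of that conversion follows, assuming a 3-D target point already expressed in the actuator's own coordinate frame; the frame convention, the function name, and the omission of any camera-to-actuator calibration are simplifying assumptions:

```python
import math

def pan_tilt_angles(target_xyz, origin=(0.0, 0.0, 0.0)):
    """Return the (yaw, pitch) angles in degrees that aim a light
    source at `target_xyz`. Convention (an assumption for this
    sketch): z points forward from the device, x to the right,
    y upward."""
    dx = target_xyz[0] - origin[0]
    dy = target_xyz[1] - origin[1]
    dz = target_xyz[2] - origin[2]
    yaw = math.degrees(math.atan2(dx, dz))                     # rotation about the vertical axis
    pitch = math.degrees(math.atan2(dy, math.hypot(dx, dz)))   # elevation above the horizontal
    return yaw, pitch
```

A servo-based actuating device, such as the one named in claim 4, would then drive its two axes to these angles so that the projected light spot lands on the first location.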
Please note that, as mentioned above, the system according to the present invention is not limited to the examples described above. It will be apparent to those skilled in the art that the various aspects of the invention claimed may be practiced in other examples that depart from these specific details.
Furthermore, the mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention and that those skilled in the art would be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim or in the description. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. In the product claims enumerating several units, several of these units can be embodied by one and the same item of software and/or hardware. The usage of the words first, second and third, et cetera, does not indicate any ordering. These words are to be interpreted as names.
Claims
1. A system (10, 10A, 10B) for guiding the placement of a sensor, comprising:
- an infrared camera configured to acquire an image of a subject;
- a first unit (11) configured to identify at least one marker (A, B) in the acquired image, wherein the at least one marker (A, B) is a retro-reflective marker and each of the at least one marker (A, B) is attached to an anatomical location of the subject;
- a second unit (12) configured to generate a control signal according to the location(s) of the identified at least one marker (A, B) in the image and predefined criteria, wherein the predefined criteria define the spatial relationship between the anatomical location(s) of the subject and a first location of the subject where the sensor is to be placed;
- a third unit (13) configured to provide an indication signal indicative of the first location of the subject;
wherein the second unit (12) is further configured to control the third unit (13) to provide the indication signal based on the control signal.
2. The system of claim 1, wherein the third unit (13) comprises an actuating device (15) and a light source (16), the actuating device (15) being controlled by the control signal to actuate the light source (16) to project a visible light as the indication signal for indicating the first location of the subject.
3. The system of claim 2, the actuating device (15) being connected with the light source (16) to adjust the pitch and yaw of the light source so as to project the visible light to the first location of the subject.
4. The system of claim 3, wherein the actuating device (15) comprises a servo motor and the light source (16) comprises a laser pointer.
5. The system of claim 1, wherein
the first unit (11) is further configured to identify an object in the image, said object being intended to be moved from a second location to the first location, and the second unit (12) is configured to generate the control signal further according to the location of the identified object in the image.
6. The system of claim 5, wherein the third unit (13) comprises an audio device, and the indication signal provided by the third unit (13) is an audio signal for guiding the object to be moved from the second location to the first location of the subject.
7. The system according to claim 5 or claim 6, wherein the system comprises the object, and one end of the object is covered with retro-reflective material.
8. The system of claim 7, wherein the object is a marking device, and when the object has been moved to the first location according to the indication signal, the object can be used to make a mark on the first location.
9. The system of claim 7, wherein the object is the sensor itself, and the sensor is attached to the first location directly when the object has been moved to the first location according to the indication signal.
10. The system of claim 1, further comprising:
- a garment configured to be worn by the subject; wherein each of the at least one marker is attached to, or is part of, the garment such that when the subject wears the garment, each of the at least one retro-reflective marker (A, B) is attached to an anatomical location of the subject.
11. A method of guiding the placement of a sensor, comprising:
- acquiring (S1) an image of a subject by an infrared camera;
- identifying (S2) at least one marker in the image, wherein the at least one marker (A, B) is a retro-reflective marker and each of the at least one marker is attached to an anatomical location of the subject;
- generating (S3) a control signal according to the location(s) of the identified at least one marker in the image and predefined criteria, wherein the predefined criteria define the spatial relationship between the anatomical location(s) of the subject and a first location of the subject where the sensor is to be placed;
- providing (S4) an indication signal indicative of the first location of the subject based on the control signal.
12. The method of claim 11, wherein the indication signal is either a visible light signal or an audio signal.
13. The method of claim 11, further comprising:
- identifying an object in the image of the subject, said object being intended to be moved from a second location to the first location;
wherein the control signal is generated further according to the location(s) of the identified object in the image and the indication signal is an audio signal for guiding the object to be moved from the second location to the first location.
14. The method of claim 13, further comprising a step of marking the first location when the object has been moved to the first location according to the indication signal.
15. A computer program comprising computer program code means for causing a computer to perform the steps of the method as claimed in any one of claims 11 to 13 when said computer program is run on a computer.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2013080519 | 2013-07-31 | ||
CNPCT/CN2013/080519 | 2013-07-31 | ||
EP13191329.5 | 2013-11-04 | ||
EP13191329 | 2013-11-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015015385A1 true WO2015015385A1 (en) | 2015-02-05 |
Family
ID=51582444
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2014/063402 WO2015015385A1 (en) | 2013-07-31 | 2014-07-25 | System and method for guiding the placement of a sensor |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2015015385A1 (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6546279B1 (en) * | 2001-10-12 | 2003-04-08 | University Of Florida | Computer controlled guidance of a biopsy needle |
FR2895267A1 (en) * | 2005-12-26 | 2007-06-29 | Sarl Bio Supply Sarl | Non-invasive navigation device for use during operation of implantation of knee prosthesis, has navigation system including unit analyzing bone representation to provide representation of axles of referred prosthesis implantation, on screen |
US20090118610A1 (en) | 2005-11-29 | 2009-05-07 | Karmarkar Parag V | Mri-guided localization and/or lead placement systems, related methods, devices and computer program products |
EP2258264A1 (en) * | 2009-06-03 | 2010-12-08 | BrainLAB AG | Express registration of body regions |
WO2011112843A1 (en) * | 2010-03-12 | 2011-09-15 | Inspire Medical Systems, Inc. | Method and system for identifying a location for nerve stimulation |
US20120144551A1 (en) * | 2010-12-09 | 2012-06-14 | Eric Guldalian | Conductive Garment |
Non-Patent Citations (1)
Title |
---|
HERMIE J HERMENS ET AL: "Development of recommendations for SEMG sensors and sensor placement procedures", JOURNAL OF ELECTROMYOGRAPHY AND KINESIOLOGY, vol. 10, no. 5, 1 October 2000 (2000-10-01), pages 361 - 374, XP055110773, ISSN: 1050-6411, DOI: 10.1016/S1050-6411(00)00027-4 * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107708547 (A) * | 2015-06-12 | 2018-02-16 | Koninklijke Philips N.V. | Surface electromyography system, recorder and method |
EP3384832A1 (en) | 2017-04-06 | 2018-10-10 | Koninklijke Philips N.V. | Method and apparatus for providing guidance for placement of a wearable device |
WO2018185122A1 (en) | 2017-04-06 | 2018-10-11 | Koninklijke Philips N.V. | Method and apparatus for providing guidance for placement of a wearable device |
CN110522411 (A) * | 2018-05-24 | 2019-12-03 | Zhejiang Tsinghua Institute of Flexible Electronics Technology | Fixing device for a wearable device |
CN110522409 (A) * | 2018-05-24 | 2019-12-03 | Zhejiang Tsinghua Institute of Flexible Electronics Technology | Method and device for determining the fixing position of a wearable device |
CN113229832 (A) * | 2021-03-24 | 2021-08-10 | Tsinghua University | System and method for acquiring human motion information |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10932689B2 (en) | Model registration system and method | |
JP6623226B2 (en) | Jaw movement tracking | |
US10881353B2 (en) | Machine-guided imaging techniques | |
US20180098816A1 (en) | Pre-Operative Registration of Anatomical Images with a Position-Tracking System Using Ultrasound | |
US9974615B2 (en) | Determining a position of a medical device to be localized | |
CN105377174A (en) | Tracking apparatus for tracking an object with respect to a body | |
JP5569711B2 (en) | Surgery support system | |
US20210186355A1 (en) | Model registration system and method | |
KR20160034104A (en) | Optical tracking system and registration method for coordinate system in optical tracking system | |
WO2015015385A1 (en) | System and method for guiding the placement of a sensor | |
JP2019005635A (en) | Radiation-free position calibration of fluoroscope | |
JP2007315827A (en) | Optical bioinstrumentation device, program therefor, and optical bioinstrumentation method | |
CN107440682B (en) | Method and system for determining the position of an electrode on a patient's body | |
US9039283B2 (en) | Method and apparatus for producing an X-ray projection image in a desired direction | |
JP6731704B2 (en) | A system for precisely guiding a surgical procedure for a patient | |
KR101716352B1 (en) | Method and apparatus for guiding adjust position for sensing terminal | |
CN111714204A (en) | Technique for registered transfer of image data of a surgical object | |
KR101398193B1 (en) | Device and Method for Calibration | |
US20210307723A1 (en) | Spatial registration method for imaging devices | |
US20230225796A1 (en) | Technique For Determining A Need For A Re-Registration Of A Patient Tracker Tracked By A Camera System | |
US11439336B2 (en) | Biological information measurement system and recording medium | |
US20210338338A1 (en) | Field Generator Assembly And Method For Surgical Navigation | |
CN110269679A (en) | Medical-technical system and method for non-intrusion type tracking object |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14771384 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14771384 Country of ref document: EP Kind code of ref document: A1 |