WO2007030868A1 - Method and system for detecting and classifying facial muscle movements - Google Patents
- Publication number
- WO2007030868A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- bio
- signals
- facial muscle
- signal
- muscle movement
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7203—Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7239—Details of waveform analysis using differentiation including higher order derivatives
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7253—Details of waveform analysis characterised by using transforms
- A61B5/726—Details of waveform analysis characterised by using transforms using Wavelet transforms
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
Definitions
- the present invention relates generally to the detection and classification of facial muscle movements, such as facial expressions or other types of muscle activity, in human subjects.
- the invention is suitable for use in electronic entertainment or other platforms in which electroencephalograph (EEG) data is collected and analysed in order to determine a subject's facial expression in order to provide control signals to that platform, and it will be convenient to describe the invention in relation to that exemplary, non-limiting application.
- EEG electroencephalograph
- Facial expression has long been one of the most important aspects of human-to-human communication. We have become accustomed to consciously and unconsciously showing our feelings and attitudes using facial expressions, and we have become highly skilled at reading and interpreting the facial expressions of others. Facial expressions form a very powerful part of our everyday life, everyday communications and interactions.
- one aspect of the invention provides a method of detecting and classifying facial muscle movements, including the steps of:
- the step of applying at least one facial muscle movement-detection algorithm to the bio-signals may include:
- the step of applying at least one facial muscle movement-detection algorithm to the bio-signals may include:
- the predetermined component vectors may be determined from applying a first component analysis to historically collected bio-signals generated during facial muscle movements of the type corresponding to that first signature.
- the first component analysis applied to the historically collected bio-signals may be independent component analysis (ICA).
- the first component analysis applied to the historically collected bio-signals may be principal component analysis (PCA).
- the method may further include the steps of:
- the second component analysis may be principal component analysis (PCA).
- PCA principal component analysis
- the step of applying at least one facial muscle movement-detection algorithm to the bio-signals may include:
- the desired transform may be selected from any one or more of a Fourier transform, wavelet transform or other signal transformation method.
- the step of applying at least one facial muscle movement-detection algorithm to the bio-signals may further include the step of: separating bio-signals resulting from the predefined type of facial muscle movement from one or more sources of noise in the bio-signals.
- the sources of noise may include any one or more of electromagnetic interference (EMI), bio-signals not resulting from the predefined type of facial muscle movement and other muscle artefacts.
- EMI electromagnetic interference
- the step of applying one or more than one facial muscle movement-detection algorithm to the bio-signals may include comparing the sum or difference of bio-signals from one or more pairs of bio-signal detectors to that signature.
- the step of applying one or more than one facial muscle movement-detection algorithm to the bio-signals may further include comparing bio-signals from each of the one or more pairs of bio-signal detectors to that signature.
- the comparing step may include:
- the comparing step may further include:
- the comparing step may further include:
- the step of applying one or more than one facial muscle movement-detection algorithm to the bio-signals may include comparing the power of bio-signals from one or more predetermined bio-signal detectors to that signature.
- the comparing step may include summing the power of bio-signals from one or more pairs of bio-signal detectors and comparing the sum to that signature;
- the comparing step may include computing the ratio of the power of bio-signals from a first group of bio-signal detectors to the power of bio-signals from a second group of bio-signal detectors;
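The power comparisons above can be sketched as follows; this is a minimal illustration, assuming mean-square power over a short window and an arbitrary channel grouping (neither the power estimator nor the grouping is specified in the text).

```python
import numpy as np

def band_power(samples: np.ndarray) -> float:
    """Mean signal power of one channel over a window (mean of squares)."""
    return float(np.mean(np.square(samples)))

def group_power_ratio(window: np.ndarray, group_a, group_b) -> float:
    """Ratio of summed power of channels in group_a to that of group_b.

    window: array of shape (n_channels, n_samples).
    group_a, group_b: lists of channel indices (illustrative grouping).
    """
    power_a = sum(band_power(window[ch]) for ch in group_a)
    power_b = sum(band_power(window[ch]) for ch in group_b)
    return power_a / power_b

# Example: 4 channels, 8 samples; compare channels {0, 1} to {2, 3}.
rng = np.random.default_rng(0)
window = rng.standard_normal((4, 8))
ratio = group_power_ratio(window, [0, 1], [2, 3])
```

The resulting ratio would then be compared against the stored signature for the relevant facial muscle movement type.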
- the bio-signals may include any one or more of electroencephalograph (EEG) signals, electrooculograph (EOG) signals and electromyography (EMG) signals.
- the method may further include the step of:
- Another aspect of the invention provides an apparatus for detecting and classifying facial muscle movements, including: a processor and associated memory device for causing the processor to carry out the method described above.
- Yet another aspect of the invention provides a computer program product, tangibly stored on machine readable medium, the product comprising instructions operable to cause a processor to carry out the method described above.
- a further aspect of the invention provides a computer program product comprising instructions operable to cause a processor to carry out the method described above.
- Figure 1 is a schematic diagram of an apparatus for detecting and classifying facial muscle movements in accordance with the present invention
- Figure 2 is a schematic diagram illustrating the positioning of scalp electrodes forming part of a head set used in the apparatus shown in Figure 1;
- Figure 3 is a flow chart illustrating the broad functional steps performed by the apparatus in Figure 1.
- Figures 4 and 5 represent exemplary signals from selected electrodes shown in Figure 2 during predefined facial movements
- Figure 6 is a representation of signals from the scalp electrode shown in Figure 2 during a number of facial muscle movements
- Figure 7 is a flow chart illustrating the steps performed in the development of signatures defining distinctive signal characteristics of predefined facial muscle movement types used in the apparatus of Figure 1 during the detection and classification of facial muscle movement;
- Figure 8 is a conceptual representation of the decomposition of signals from the sensors shown in Figure 2 into predetermined components as performed by the apparatus of Figure 1, in at least one mode of operation;
- Figure 9 is a representation of a signal from one of the sensors shown in Figure 2 during a sequence of eye blinks
- Figure 10 is a flow chart illustrating the steps performed by the apparatus of Figure 1 both before and during bio-signal detection and classification in at least one mode of operation;
- Figure 11 is a schematic diagram showing an eye blink component vector present in the bio-signals captured from the sensors shown in Figure 2 during an exemplary eye blink;
- Figure 12 is a flow chart of one exemplary algorithm for detecting and classifying facial muscle movements as eye blinks
- Figure 13 shows a representation of a bio-signal detected from an exemplary sensor shown in Figure 2 and subsequent analysis performed on that bio-signal;
- Figure 14 represents a flow chart of another exemplary algorithm for detecting and classifying facial muscle movements as eye blinks
- Figure 15 is a flow chart of one exemplary algorithm for detecting and classifying facial muscle movements as smiles or clenches.
- Figure 16 is a representation of signals from the sensors shown in Figure 2 during a smile.
- the apparatus 100 includes a headset 102 of bio-signal detectors capable of detecting various bio-signals from a subject, particularly electrical signals produced by the body, such as electroencephalograph (EEG) signals, electrooculograph (EOG) signals, electromyograph (EMG) signals or like signals.
- the apparatus 100 can also include bio-signal detectors capable of detecting other physiological signals, such as skin conductance.
- the headset 102 includes a series of scalp electrodes for capturing EEG signals from the user.
- the scalp electrodes may directly contact the scalp or alternately may be of the non-contact type that does not require direct placement on the scalp.
- the headset is generally portable and non- constraining.
- the electrical fluctuations detected over the scalp by the series of scalp sensors are attributed largely to brain tissue located at or near the skull.
- the source is the electrical activity of the cerebral cortex, a significant portion of which lies on the outer surface of the brain below the scalp.
- the scalp electrodes pick up electrical signals naturally produced by the brain and make it possible to observe electrical impulses across the surface of the brain.
- although the headset 102 includes several scalp electrodes, in other embodiments a different number of scalp electrodes, e.g. sixteen electrodes, may be used in a headset.
- Traditional EEG analysis has focused solely on these signals from the brain.
- the main applications have been explorative research in which different rhythms (alpha wave, beta wave, etc) have been identified, pathology detection in which onset of dementia or physical injury can be detected, and self improvement devices in which bio-feedback is used to aid in various forms of meditation.
- Traditional EEG analysis considers signals resulting from facial muscle movement such as eye blinks to be artefacts that mask the real EEG signal desired to be analysed. Various procedures and operations are performed to filter these artefacts out of the EEG signals selected.
- the applicants have developed technology that enables the sensing and collecting of electrical signals from the scalp electrodes, and the application of signal processing techniques to analyze these signals in order to detect and classify human facial expressions such as blinking, winking, frowning, smiling, laughing, talking etc.
- the result of this analysis is able to be used by a variety of other applications, including but not being limited to electronic entertainment applications, computer programs and simulators.
- Each of the signals detected by the headset 102 of electrodes is fed through a sensor interface 104, which can include an amplifier to boost signal strength and a filter to remove noise, and is then digitized by an analogue-to-digital converter 106. Digitized samples of the signal captured by each of the scalp sensors are stored during operation of the apparatus 100 in a data buffer 108.
- the apparatus 100 further includes a processing system 109 including a digital signal processor 112, a co-processing device 110 and an associated memory device for storing a series of instructions (otherwise known as a computer program or computer control logic) to cause the processing system 109 to process the captured bio-signals.
- the memory includes a series of instructions defining at least one algorithm 114 to be performed by the digital signal processor 112 for detecting and classifying a predetermined type of facial muscle movement.
- a corresponding control signal is transmitted in this exemplary embodiment to an input/output interface 116 for transmission via a wireless transmission device 118 to a platform 120 for use as a control input by electronic entertainment applications, programs, simulators or the like.
- the algorithms are implemented in software and the series of instructions is stored in the memory of the processing system, e.g., in the memory of the processing system 109.
- the series of instructions causes the processing system 109 to perform the functions of the invention as described herein.
- Prior to being loaded into the memory, the instructions can be tangibly embodied in a machine readable storage device, such as a computer disk or memory card, or in a propagated signal.
- the algorithms are implemented primarily in hardware using, for example, hardware components such as application specific integrated circuits (ASICs). Implementation of the hardware state machine so as to perform the functions described herein will be apparent to persons skilled in the relevant art.
- the algorithms are implemented using a combination of software and hardware.
- FPGA field programmable gate array
- the processing functions could be performed by a single processor.
- the buffer 108 could be eliminated or replaced by a multiplexer (MUX), and the data stored directly in the memory of the processing system.
- MUX could be placed before the A/D converter stage so that only a single A/D converter is needed.
- the connection between the apparatus 100 and the platform 120 can be wired rather than wireless.
- the apparatus 100 can include a head set assembly that includes the head set, a MUX, A/D converter(s) before or after the MUX, a wireless transmission device, a battery for power supply, and a microcontroller to control battery use, send data from the MUX or A/D converter to the wireless chip, and the like.
- the apparatus can also include a separate processor unit that includes a wireless receiver to receive data from the headset assembly, and the processing system, e.g. the digital signal processor and the co-processor.
- the processor unit can be connected to the platform by a wired or wireless connection.
- the apparatus can include a head set assembly as described above, the platform can include a wireless receiver to receive data from the headset assembly, and a digital signal processor dedicated to detection of facial muscle movement can be integrated directly into the platform.
- the apparatus can include a head set assembly as described above, the platform can include a wireless receiver to receive data from the headset assembly, and the facial muscle movement detection algorithms are performed in the platform by the same processor, e.g., a general purpose digital processor, that executes the application, programs, simulators or the like.
- Figure 2 illustrates one example of the positioning system 200 of the scalp electrodes forming part of the headset 102.
- the system 200 of electrode placement shown in Figure 2 is referred to as the "10-20" system and is based on the relationship between the location of an electrode and the underlying area of cerebral cortex.
- Each point on the electrode placement system 200 indicates a possible scalp electrode position.
- Each site includes a letter to identify the lobe and a number or other letter to identify the hemisphere location.
- the letters F, T, C, P, O stand for Frontal, Temporal, Central, Parietal and Occipital. Even numbers refer to the right hemisphere and odd numbers refer to the left hemisphere.
- the letter Z refers to an electrode placed on the mid-line.
- the midline is a line along the scalp on the sagittal plane originating at the nasion and ending at the inion at the back of the head.
- the "10" and "20" refer to percentages of the mid-line division.
- the mid-line is divided into 7 positions, namely, Nasion, Fpz, Fz, Cz, Pz, Oz and Inion, and the angular intervals between adjacent positions are 10%, 20%, 20%, 20%, 20% and 10% of the mid-line length respectively.
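The cumulative placement of the seven mid-line positions follows directly from these interval percentages; a small sketch:

```python
# Cumulative placement of the seven mid-line positions, as percentages of
# the nasion-to-inion distance, from the 10/20/20/20/20/10 intervals above.
positions = ["Nasion", "Fpz", "Fz", "Cz", "Pz", "Oz", "Inion"]
intervals = [10, 20, 20, 20, 20, 10]  # percent of mid-line length

placement = {positions[0]: 0}
total = 0
for name, step in zip(positions[1:], intervals):
    total += step
    placement[name] = total

# placement == {"Nasion": 0, "Fpz": 10, "Fz": 30, "Cz": 50,
#               "Pz": 70, "Oz": 90, "Inion": 100}
```

Cz thus sits at the 50% point of the mid-line, consistent with its central position in the 10-20 system.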
- the headset 102 including scalp electrodes positioned according to the system 200, is placed on the head of a subject in order to detect EEG signals.
- the EEG signals are captured by a neuro- physiological signal acquisition device and then converted into the digital domain at step 302 using the analogue to digital converters 106.
- a series of digitized signals from each of the sensors is then stored at step 304 in the data buffer 108.
- One or more facial muscle movement-detection algorithms are then applied at step 306 in order to detect and classify different facial muscle movements, including facial expressions or other muscle movements.
- Each of the algorithms generates a result representing the facial expression(s) of the subject.
- Figure 4 shows a representation 400 of a signal from the Fp1 or Fp2 electrode (as seen in the electrode positioning system 200 shown in Figure 2) during a series of eye blinks.
- Figure 5 shows a representation 500 of a signal from the T7 or T8 electrode resulting from a series of smiles by a subject.
- Figure 6 shows a representation 600 of the signals from each of the electrodes in the headset 102 when various eye movements are performed by the subject.
- the impact of an up, down, left and right eye movement can be observed from the circled portions of signal representations.
- the apparatus 100 acts to isolate these perturbations and then apply one or more algorithms in order to classify the type of facial muscle movement responsible for producing the perturbations.
- the apparatus 100 applies at least one facial muscle movement-detection algorithm 114 to a portion of the bio-signals captured by the headset 102 affected by a predefined type of facial muscle movement in order to detect facial muscle movements of that predefined type.
- a mathematical signature defining one or more distinctive characteristics of the predefined facial muscle movement type is stored in the memory device 112. The relevant portion of the bio-signals affected by the predefined type of facial muscle movement is then compared to that mathematical signature.
- stimuli are developed at step 700 to elicit that particular facial expression.
- the stimuli are generally in the form of an audio visual presentation or a set of commands.
- the set of stimuli is tested at step 702 until a high degree of correlation between the developed stimuli and the resultant desired facial muscle movement is obtained.
- EEG signal recordings are made at step 704 that contain many examples of the desired facial muscle movements. Ideally, these facial muscle movements should be as natural as possible.
- signal processing operations are then performed at step 706 in order to identify one or more distinctive signal characteristics of each predefined facial muscle movement type. Identification of these distinctive signal characteristics in each EEG signal recording enables the facial muscle movement of a subject to be classified at step 708 and an output signal representative of the detected type of facial muscle movement to be generated at step 710. Testing and verification of the output signal at step 712 enables a robust data set to be established.
- it may be necessary to develop a mathematical signature for each subject.
- a generic mathematical signature can be developed for each type of facial muscle movement, e.g., using a limited number of subjects, and stored in the memory of the digital signal processor 112 without requiring the aforementioned steps to be carried out by each subject.
- the portion of the bio-signals affected by a predefined type of facial muscle movement is predominantly found in signals from a limited number of scalp electrodes.
- eye movement and blinking can be detected by using only two electrodes near the eyes, such as the Fp1 and Fp2 channels shown in Figure 2.
- signals from those electrodes can be directly compared to the mathematical signatures defining the distinctive signal characteristics of the eye blink or other predefined facial muscle movement type.
- a weighting may be applied to each signal prior to the signal combining operation in order to improve the accuracy of the facial muscle movement detection and classification.
- the apparatus 100 acts to decompose the scalp electrode signals into a series of components and then to compare the projection of the bio-signals from the scalp electrodes onto one or more predetermined component vectors with the mathematical signatures defining the signal characteristics of each type of facial muscle movement.
- independent component analysis has been found to be useful for defining the characteristic forms of the potential function across the entire scalp.
- Independent component analysis maximizes the degree of statistical independence among outputs using a series of contrast functions.
- the rows of an input matrix X represent data samples from the bio-signals in the headset 102 recorded at different electrodes whereas the columns are measurements recorded at different time points.
- Independent component analysis finds an "unmixing" matrix W which decomposes or linearly unmixes the multi-channel scalp data into a sum of temporally independent and spatially fixed components.
- the rows of the output data matrix U = WX are time courses of activation of the ICA components.
- the columns of the inverse matrix W⁻¹ give the relative projection strengths of the respective components onto the signals from the scalp electrodes. These scalp weights give the scalp topography of each component vector.
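The matrix relationships above can be illustrated with a short sketch; here a random invertible matrix stands in for the unmixing matrix W, which in practice would be estimated by an ICA algorithm such as Infomax or FastICA:

```python
import numpy as np

# Illustrative shapes only: 32 channels, 256 samples (one second at 256 Hz).
n_channels, n_samples = 32, 256
rng = np.random.default_rng(1)
X = rng.standard_normal((n_channels, n_samples))   # rows: electrode recordings

# W would normally be estimated by ICA from the recorded data;
# a random (almost surely invertible) matrix stands in for it here.
W = rng.standard_normal((n_channels, n_channels))

U = W @ X                 # rows of U: time courses of the ICA components
W_inv = np.linalg.inv(W)  # columns of W_inv: scalp projection of each component

# Scalp topography (relative projection strength) of component k:
k = 0
scalp_map = W_inv[:, k]

# The decomposition is exact: the scalp data are recovered from the components.
assert np.allclose(W_inv @ U, X)
```

The `scalp_map` vector corresponds to the kind of component topography shown conceptually in Figure 8 and, for an eye blink, in Figure 11.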
- PCA principal component analysis
- independent component analysis and principal component analysis may be used in order to detect and classify facial muscle movements.
- the apparatus 100 may act to apply a desired Fourier transform to the bio-signals from the scalp electrodes.
- the transform could alternatively be a wavelet transform or any other suitable signal transformation method. Combinations of one or more different signal transformation methods may also be used. Portions of the bio-signals affected by a predefined type of facial muscle movement may then be identified using a neural network.
- Each of the above described techniques for detection and classification of the facial muscle movements may be incorporated into a facial muscle movement detection algorithm stored in the memory of and performed by the digital signal processor 112. Once a particular facial muscle movement detection algorithm has been fully developed, the algorithm may be implemented as a real-time software program or transferred into a digital signal processing or other suitable environment.
- Eye blinks are present in all anterior electrodes but feature most prominently in the two frontal channels Fp1 and Fp2.
- Figure 9 is a representation 900 of the bio-signal recorded at the scalp electrode Fp1 during 3 typical eye blinks. It can be seen from signal portions 902, 904 and 906 of the bio-signal from the frontal channel Fp1 that each of the 3 eye blinks has a significant effect on the bio-signal.
- the projections of the bio-signals from the frontal electrodes Fp1 and Fp2 on predetermined component vectors are used to detect and classify the perturbation in the bio-signals as an eye blink.
- the predetermined component vectors are identified from historically collected data from a number of subjects and/or across a number of different sessions.
- the EEG data from a number of different subjects and/or across a number of different sessions are recorded at step 1000 when the desired facial muscle movements are being generated by the subjects.
- independent component analysis is performed on the recorded EEG data and the component vectors onto which are projected the perturbations in the EEG signals resulting from the relevant facial muscle movement are determined at step 1004.
- the relevant component vectors to be used in subsequent data recording and analysis are then recorded in the storage device 112 by facial muscle movement type.
- three exemplary types of facial muscle movement are able to be classified, namely vertical eye movement at step 1006, horizontal eye movement at step 1008 and an eye blink at step 1010.
- independent component analysis is a computationally time consuming activity and is therefore inappropriate for some applications, such as real-time use. Whilst independent component analysis may be used to generate average component vectors for use in the detection and classification of various types of facial muscle movements, the balance of signals across different electrodes varies slightly across different sessions and users.
- the average component vectors defined using independent component analysis of historically gathered data may not be optimal during realtime data detection and classification.
- principal component analysis can be performed on the real-time data and the resulting component vector can be used to update the component vector generated by independent component analysis throughout each session.
- the resulting facial muscle movement-detection algorithms can be made robust against electrodes shifting and variances in the strengths of electrode contact.
- the projection of the historically collected data onto the component vector is initially used as a reference in the facial muscle movement detection algorithms 114.
- principal component analysis is carried out at step 1016 on the stored data, and the results of the analysis generated at step 1018 are then used to update the component vectors developed during offline independent component analysis.
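A minimal sketch of this session-time update, assuming the leading principal component is extracted from a covariance eigendecomposition and blended into the stored ICA-derived vector; the blending scheme and smoothing factor are illustrative assumptions, not taken from the text:

```python
import numpy as np

def leading_principal_component(window: np.ndarray) -> np.ndarray:
    """First principal component of a (n_channels, n_samples) data window."""
    centred = window - window.mean(axis=1, keepdims=True)
    cov = centred @ centred.T / (window.shape[1] - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    return eigvecs[:, -1]                    # eigenvector of the largest one

def update_component_vector(stored: np.ndarray, window: np.ndarray,
                            alpha: float = 0.1) -> np.ndarray:
    """Blend the stored (ICA-derived) vector toward the session's PCA vector.

    alpha is an illustrative smoothing factor (an assumption).
    """
    pc = leading_principal_component(window)
    if np.dot(pc, stored) < 0:               # resolve PCA's sign ambiguity
        pc = -pc
    updated = (1 - alpha) * stored + alpha * pc
    return updated / np.linalg.norm(updated)

# Example: refine a stored unit vector with one second of illustrative data.
stored = np.eye(32)[0]
window = np.random.default_rng(3).standard_normal((32, 256))
refined = update_component_vector(stored, window)
```

Repeating this update throughout a session keeps the component vector tracking the current electrode placement and contact quality.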
- component vectors can be used in order that a correct weighting is applied to the contribution from the signals of each relevant electrode.
- An example of an eye-blink component vector is shown in the vector diagram 1100 in Figure 11. From this diagram it can be seen that the largest contribution to the component is indeed from the two frontal electrodes Fp1 and Fp2. However, it is also apparent that the eye blink is not symmetric. In this case, the potential around the electrode Fp2 is larger than that of the electrode Fp1. The difference may be due to a number of causes, for example, muscle asymmetry, the electrodes not being symmetrically located on the head of a subject, or a difference in the electrical impedance of the contact with the scalp.
- This diagram illustrates the desirability of optimizing the component vectors during each session, for example by applying the steps illustrated in Figure 10.
- Figure 12 shows one example of a facial muscle movement-detection algorithm 1200 used to detect an eye blink.
- the algorithm 1200 may be applied to the activations of component vectors or alternatively may be applied to signals from individual scalp electrodes or from combinations of signals from more than one scalp electrode.
- the projection of the EEG signals onto the component vector associated with an eye blink is initially passed through a low pass filter at step 1202.
- a first order derivative operation is then performed on the signal at step 1204.
- the first order derivative of a function f with respect to an infinitesimal change in x is defined as f′(x) = lim h→0 [f(x + h) − f(x)] / h.
- zero-crossing points in the first order derivative signal fall into two categories: positive zero-crossing points and negative zero-crossing points.
- the sign (namely either positive or negative) of the zero- crossing points indicates whether the signal increases or decreases after crossing the axis.
- positive zero-crossing points define boundary conditions of an eye blink.
- a negative zero-crossing point 1310 defines the peak of the eye blink. Accordingly, the algorithm 1200 determines at step 1206 whether a zero-crossing point occurs in the digitized data stored in data buffer 108. If this is the case, a determination is made at step 1208 if the crossing type is a positive or a negative zero-crossing.
- if a negative zero-crossing point is detected, the peak amplitude of the corresponding signal is checked at step 1210 to verify whether the transitory rise in signal amplitude results from a real eye blink. If a positive zero-crossing point is detected, the algorithm stores this information in a state queue at step 1214 when the queue contains no preceding negative zero-crossing point whose corresponding signal amplitude satisfies the peak value condition determined at step 1212. If such a preceding negative zero-crossing point is stored in the state queue, an assertion that an eye blink has occurred is made at step 1212. The algorithm resets if the found negative zero-crossing point does not satisfy the peak value condition or once an eye blink detection assertion is made.
- the algorithm verifies that there exists a negative zero-crossing point sandwiched between the two positive zero-crossing points, and that the eye blink peak exceeds the amplitude threshold.
- a default value of the amplitude threshold is used initially, but to increase the accuracy of the algorithm, the threshold amplitude is optionally adjusted at step 1218 based upon the strength of an individual's eye blink peaks.
- the eye blink "signature" defines the distinctive signal characteristics representative of an eye blink, namely a negative zero crossing sandwiched between two positive zero crossings in the first order derivative of the filtered signal, and a signal amplitude greater than a predetermined threshold in the filtered signal.
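A sketch of this signature test, applied to an already-filtered signal; the structure (a negative zero crossing of the first derivative, bracketed by positive zero crossings, plus a peak amplitude test) follows the description above, while the implementation details are illustrative:

```python
import numpy as np

def detect_blinks(filtered: np.ndarray, threshold: float):
    """Return indices of blink peaks matching the signature above:
    a negative zero crossing of the first derivative (a signal peak)
    bracketed by positive zero crossings (troughs), where the filtered
    signal at the peak exceeds `threshold`."""
    d = np.diff(filtered)                  # first order derivative
    blinks, last_pos, pending_peak = [], None, None
    for i in range(1, len(d)):
        if d[i - 1] < 0 <= d[i]:           # positive zero crossing (trough)
            if pending_peak is not None and last_pos is not None:
                blinks.append(pending_peak)  # peak now bounded on both sides
            last_pos, pending_peak = i, None
        elif d[i - 1] > 0 >= d[i]:         # negative zero crossing (peak)
            if last_pos is not None and filtered[i] > threshold:
                pending_peak = i
    return blinks

# Illustrative filtered trace with one trough-peak-trough excursion.
x = np.linspace(0, 14, 1400)
peaks = detect_blinks(-np.cos(x), threshold=0.5)
```

Only a peak that is preceded and followed by a trough, and that clears the amplitude threshold, is reported, which suppresses slow drifts and small fluctuations.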
- the signature is optionally updated by changing the threshold forming part of the distinctive signal characteristics of the signature during facial muscle movement detection and classification.
- the digital signature may define other amplitudes or signal characteristics that exceed one or more predetermined thresholds.
- the signature may be updated during facial muscle movement detection and classification by changing one or more of those thresholds. More generally, any one or more distinctive signal characteristics of a predetermined facial muscle movement type that form part of a digital signature can be updated during the course of facial muscle movement detection and classification in order to improve the viability and accuracy of the facial muscle movement detection algorithms implemented by the apparatus 100.
- FIG 14 shows another example of a facial muscle movement detection algorithm 1400 used to detect an eye blink.
- the algorithm 1400 involves examining the correlation of signals between channel pairs, the amplitude and gradient of bio-signals from each of the channels Fp1 and Fp2, and the sum of bio-signals from that pair of channels.
- bio-signals from the sensors shown in Figure 2 are sampled at a rate of 256 Hz resulting in 256 samples per second being generated for each of the 32 channels.
- channels 1 and 30, corresponding respectively to bio-signals from the sensors Fp1 and Fp2, are extracted for a data window corresponding to 1/32 of a second, containing 8 samples.
- samples from channels 1 and 30 are summed.
- a third order infinite impulse response (IIR) low pass filter is applied at 10 Hz, whilst at step 1410 a first order IIR high pass filter is applied at 0.125 Hz.
- a first order derivative operation is performed on the sum of bio-signals from channels Fp1 and Fp2. Similar to the aforementioned algorithm 1200, an eye blink peak is tracked by the negative zero-crossing point of the first order derivative. The rise and the fall of an eye blink signal peak are bounded by positive zero-crossing points preceding and following this negative zero-crossing point of the first order derivative, respectively.
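This pre-processing chain (sum, low pass, high pass, derivative) might be sketched as follows. The step sequence is from the description, but the Butterworth filter design and SciPy usage are assumptions — the patent specifies only the filter orders and cutoff frequencies:

```python
import numpy as np
from scipy import signal

FS = 256.0  # sampling rate used in this embodiment

# Filter designs are assumptions: the description gives only order and cutoff.
b_lp, a_lp = signal.butter(3, 10.0 / (FS / 2), btype="low")    # 3rd order IIR LP, 10 Hz
b_hp, a_hp = signal.butter(1, 0.125 / (FS / 2), btype="high")  # 1st order IIR HP, 0.125 Hz

def preprocess(fp1, fp2):
    """Sum the Fp1 and Fp2 channels, band-limit the sum, and return the
    first order derivative whose zero crossings bound the blink peak."""
    summed = np.asarray(fp1, dtype=float) + np.asarray(fp2, dtype=float)
    filtered = signal.lfilter(b_lp, a_lp, summed)
    filtered = signal.lfilter(b_hp, a_hp, filtered)
    return np.diff(filtered)
```

In a streaming implementation the filter states would be carried between 8-sample windows (e.g. via the `zi` argument of `lfilter`); that bookkeeping is omitted here for brevity.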
- An assessment is made at step 1414 as to whether the correlation between the filtered signals of channels Fp1 and Fp2, for the window of data bounded by a positive zero-crossing and a negative zero-crossing for the rise of a peak (and vice versa for the fall of a peak), exceeds a first predetermined threshold; at step 1416 as to whether the lesser amplitude of the rise or the fall of an eye blink peak signal from either of the individual channels Fp1 and Fp2 exceeds a second predetermined threshold; and at step 1418 as to whether the maximum gradient determined from the peak or trough of the first order derivative of the summed signals from channels Fp1 and Fp2 exceeds a third predetermined threshold. If all three values exceed their respective thresholds, then an eye opening or eye closing event is detected at step 1420 (dependent on whether the maximum gradient is positive or negative).
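The three comparisons at steps 1414-1418 can be sketched as below. The threshold values, the correlation normalisation and the mapping from gradient sign to event label are all illustrative assumptions, not taken from the patent:

```python
import numpy as np

def classify_blink_edge(fp1, fp2, d_sum, corr_th=0.8, amp_th=1.0, grad_th=0.5):
    """fp1, fp2: filtered single-channel windows bounded by consecutive
    zero crossings; d_sum: first order derivative of the summed Fp1+Fp2
    signal over the same window. Returns an event label or None."""
    # step 1414: correlation between the two channels over the window
    corr = np.sum(fp1 * fp2) / np.sqrt(np.sum(fp1 ** 2) * np.sum(fp2 ** 2))
    # step 1416: lesser of the two single-channel amplitude displacements
    amp = min(fp1.max() - fp1.min(), fp2.max() - fp2.min())
    # step 1418: signed maximum gradient of the summed signal
    grad = d_sum[np.argmax(np.abs(d_sum))]
    if corr > corr_th and amp > amp_th and abs(grad) > grad_th:
        # the sign of the gradient distinguishes the two event types;
        # the particular labels here are illustrative
        return "eye_closing" if grad > 0 else "eye_opening"
    return None
```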
- Similar algorithms can be used to identify winks, eyeball motions or other related facial muscle expressions.
- Other algorithms may use different combinations of signal correlation, amplitude displacement and signal gradient measurement, as well as assessing the sum or difference of bio-signals from one or more different channel pairs to those used in the algorithm illustrated in Figure 14.
- at step 1504 data from several channel pairs is extracted, namely data from channels 6 and 11 corresponding to detectors T7 and T8, channels 2 and 15 corresponding to detectors AF3 and AF4, channels 4 and 13 corresponding to detectors FC5 and FC6, and channels 3 and 14 corresponding to detectors F7 and F8.
- a data window is created for each of these channels in which 64 consecutive samples (corresponding to a quarter of a second) are considered.
- the power of the signal represented by the 64 consecutive samples in the data window for each channel is calculated, and a single power value computed for each channel at step 1510.
- at step 1512 the ratio of the power on channel pair T7 and T8 to the average of the power on channel pair AF3 and AF4 and channel pair FC5 and FC6 is computed.
- at step 1514 a low pass filter is applied to the computation carried out at step 1512.
- at step 1516 a determination is made as to whether the ratio computed at step 1512 exceeds a predetermined threshold indicative of a smile. If this is the case, then a smile is detected at step 1518; otherwise step 1506 is performed again with the next 64 data samples for each of the channels extracted at step 1504.
- Whilst step 1512 is being computed in order to determine whether a smile is present in the bio-signal data, at step 1522 the average power on channel pair FC5 and FC6 is computed in order to determine whether a clench is present in the bio-signals. If it is determined at step 1526 that this computed average power exceeds a predetermined threshold indicative of a clench, then a clench is detected at step 1528. If the average power computed at step 1522 does not exceed the predetermined threshold indicative of a clench at step 1526, then the power ratio computed at step 1512 is compared to the threshold indicative of a smile in order to determine whether a smile is present in the bio-signals.
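The smile/clench decision flow of steps 1512-1528 can be sketched as follows. The threshold values are placeholders (real values would come from calibration), and averaging all four frontal channels together is one reading of the ratio described at step 1512:

```python
SMILE_TH = 2.0    # placeholder threshold indicative of a smile
CLENCH_TH = 5.0   # placeholder threshold indicative of a clench

def classify_window(power):
    """power: per-channel signal power for one 64-sample window,
    keyed by detector name. Returns a label or None."""
    # steps 1522/1526: average power on FC5/FC6, checked for a clench first
    clench_power = (power["FC5"] + power["FC6"]) / 2.0
    if clench_power > CLENCH_TH:
        return "clench"
    # steps 1512/1516: temporal-to-frontal power ratio, checked for a smile
    temporal = (power["T7"] + power["T8"]) / 2.0
    frontal = (power["AF3"] + power["AF4"] + power["FC5"] + power["FC6"]) / 4.0
    if temporal / frontal > SMILE_TH:
        return "smile"
    return None
```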
- a signal power profile can be created from the signal power on all channels, as a 32-channel vector.
- There are several ways of creating a signal power profile for an expression. For example, a combination of principal component analysis, simple statistics such as mean and median, and manual inspection can be used to create the signal power profile. The profile can then be normalized to a unit vector for scaling simplicity. The dot product of the signal power profile and the normalized signal power on all channels can then be used as the signal to identify when a particular facial expression occurs.
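The profile matching step reduces to a normalised dot product, as in this minimal sketch (the function name is illustrative; the profile is assumed to have been built offline as described):

```python
import numpy as np

def expression_score(profile, channel_power):
    """profile: unit-norm 32-channel vector characterising an expression;
    channel_power: current signal power on all 32 channels. The dot
    product of the profile with the normalised powers is the signal used
    to identify when the expression occurs."""
    p = np.asarray(channel_power, dtype=float)
    p = p / np.linalg.norm(p)      # normalise the current powers
    return float(np.dot(profile, p))
```

Because both vectors are normalised, the score is scale invariant: uniformly stronger bio-signals do not change it, only the shape of the power distribution across channels does.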
- signal power on a channel is calculated by taking the first order derivative of the channel signal, which may first be passed through a high pass filter.
- the absolute value of the first order derivative of a channel signal is then taken, and some fraction of the lowest and highest values is discarded, typically 3/8 of the lowest and 1/8 of the highest values.
- the mean of the non-discarded portion is then calculated.
- the data window used to calculate the signal power is normally in the order of a quarter of a second.
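The power calculation of the preceding paragraphs amounts to a trimmed mean of derivative magnitudes, sketched below (the optional high pass pre-filter is omitted, and the function name is illustrative):

```python
import numpy as np

def channel_power(x, low_frac=3 / 8, high_frac=1 / 8):
    """Mean of the absolute first order derivative of the channel signal
    after discarding the lowest 3/8 and the highest 1/8 of the values."""
    mags = np.sort(np.abs(np.diff(np.asarray(x, dtype=float))))
    lo = int(len(mags) * low_frac)            # drop the lowest 3/8
    hi = len(mags) - int(len(mags) * high_frac)  # drop the highest 1/8
    return float(np.mean(mags[lo:hi]))
```

Discarding the extremes makes the measure robust both to flat stretches of signal and to isolated spikes within the quarter-second window.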
- Alternative methods of calculating signal power can be used in other embodiments of the invention. These methods may be based upon signal power present in particular frequency bands on channels, or the ratio of power in two different frequency bands or two different channels. These could be different channels on the same band, different bands on the same channel or different bands on different channels.
- the channel correlation performed in relation to the algorithm shown in Figure 14 can be calculated according to the expression sum(xy)/sqrt(sum(x²)+sum(y²)). In some embodiments of the invention, however, an alternative measure of correlation is used: sum(xy)/sqrt(sum(max(x,y)²)). In both of the preceding expressions, the terms x and y denote the two signals being correlated. This latter correlation method has the advantage of reducing the correlation value when two signals have similar profiles but one is very much larger than the other.
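Both measures can be transcribed directly; the expressions below reproduce the formulas exactly as printed in the description, and whether the sums and square root are grouped this way in the actual implementation cannot be confirmed from the text:

```python
import numpy as np

def corr_described(x, y):
    """Correlation as printed: sum(xy) / sqrt(sum(x^2) + sum(y^2))."""
    return float(np.sum(x * y) / np.sqrt(np.sum(x ** 2) + np.sum(y ** 2)))

def corr_alternative(x, y):
    """Alternative measure as printed: sum(xy) / sqrt(sum(max(x, y)^2))."""
    m = np.maximum(x, y)  # element-wise maximum of the two signals
    return float(np.sum(x * y) / np.sqrt(np.sum(m ** 2)))
```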
- the facial expression detection algorithms described above can have multiple threshold values associated therewith. These threshold values may not be intuitive to adjust and it is therefore useful to be able to translate these thresholds into one or more intuitive "sensitivity" parameters.
- the effect that an eye blink has on the EEG signal from any one or more of the detectors shown in Figure 2 can be simply understood as concurrent upward deflections on a number of frontal channels. These deflections may be characterized by the minimum height of the deflection, the minimum gradients of the deflection and the minimum correlations of the signals on different channels. How these three thresholds are combined is not necessarily straightforward to the non-expert user. In order to convert the multiple thresholds to a single sensitivity parameter, it may be desirable to evaluate a reasonable range for each threshold. Each threshold may then be interpolated, either linearly or otherwise, between the minimum and maximum values of the reasonable range.
- for a given sensitivity s, the threshold value for each parameter is:
- threshold = thresh_max + (thresh_min - thresh_max) * s
- conversely, the sensitivity may be inferred from any of the individual thresholds by inverting this relation.
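The interpolation and its inverse can be written directly; s is assumed to lie in [0, 1], with s = 0 the least sensitive setting:

```python
def threshold_from_sensitivity(s, thresh_min, thresh_max):
    """Linearly interpolate a threshold from a single sensitivity s:
    s = 0 yields thresh_max (least sensitive), s = 1 yields thresh_min."""
    return thresh_max + (thresh_min - thresh_max) * s

def sensitivity_from_threshold(t, thresh_min, thresh_max):
    """Invert the relation to recover the sensitivity implied by any
    one of the individual thresholds."""
    return (t - thresh_max) / (thresh_min - thresh_max)
```

Non-linear interpolation (e.g. logarithmic) can be substituted for the linear form where a threshold's reasonable range spans orders of magnitude.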
- the expressions detected via noise profiling may have very different values. Automatic calibration of these algorithms can be performed to cater for this variability. Calibration can be performed by recording a "neutral" state, defined as anything but the expressions being calibrated. Noise values are calculated for this period, the values obtained are sorted, and the lower 50% of the values are discarded. The average of the remaining values is then used as a baseline, above which it can be assumed that some small amount of expression is present.
- Calibration of the expression's maximum threshold can then be done by using a multiple of the lower threshold, for example 2 or 3 times the lower threshold.
- the subject can also be asked to perform the expression, and the values obtained during this period used to determine the upper end of the facial expression range. Care should be taken, as the perturbations caused by forced expressions may not be as large as naturally occurring expressions, so the maximum value found during such a calibration should be treated as only 50% of the range.
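The calibration procedure above might be sketched as follows. The interpretation that the forced-expression maximum represents 50% of the natural range (so the range top is twice the observed excursion above baseline) is an assumption, as is the default multiplier:

```python
import numpy as np

def calibrate(neutral_values, expression_values=None, upper_mult=3.0):
    """Return (baseline, upper) thresholds. The baseline is the average
    of the upper 50% of neutral-state noise values; the upper threshold
    is a multiple of the baseline, or - when a forced-expression
    recording is available - twice the observed excursion above the
    baseline (treating the forced maximum as 50% of the natural range)."""
    v = np.sort(np.asarray(neutral_values, dtype=float))
    baseline = float(v[len(v) // 2:].mean())  # discard lower 50%, average rest
    if expression_values is not None:
        peak = float(max(expression_values))
        upper = baseline + 2.0 * (peak - baseline)
    else:
        upper = baseline * upper_mult         # e.g. 2-3x the lower threshold
    return baseline, upper
```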
Abstract
A method of detecting and classifying facial muscle movements, comprising the steps of: receiving bio-signals from at least one bio-signal detector; and applying at least one facial muscle movement-detection algorithm to a portion of the bio-signals affected by a predefined type of facial muscle movement in order to detect facial muscle movements of that predefined type.
Description
METHOD AND SYSTEM FOR DETECTING AND CLASSIFYING FACIAL
MUSCLE MOVEMENTS
FIELD
The present invention relates generally to the detection and classification of facial muscle movements, such as facial expressions or other types of muscle activity, in human subjects. The invention is suitable for use in electronic entertainment or other platforms in which electroencephalograph (EEG) data is collected and analysed to determine a subject's facial expression in order to provide control signals to that platform, and it will be convenient to describe the invention in relation to that exemplary, non-limiting application.
BACKGROUND
Facial expression has long been one of the most important aspects of human-to-human communication. Humans have become accustomed to consciously and unconsciously showing their feelings and attitudes using facial expressions, and have become highly skilled at reading and interpreting the facial expressions of others. Facial expressions form a very powerful part of everyday life, communications and interactions.
As technology progresses, more of our communication is mediated by machines. People now "congregate" in virtual chat rooms to discuss issues with other people. Text messaging is becoming more popular, resulting in new orthographic systems being developed to cope with this non-human medium. Currently, facial expressions are not used in man-machine communication interfaces. Interactions with machines are restricted to the use of cumbersome input devices such as keyboards and joysticks. This limits our communication to only premeditated and conscious actions.
There therefore exists a need for technology that simplifies man-machine communication. It would moreover be desirable for this technology to be robust, powerful and adaptable to a number of platforms and environments. It would also be desirable for this technology to make use of natural human-to-human interaction techniques so that the man-machine interface is as natural as possible for a human user.
SUMMARY
With this in mind, one aspect of the invention provides a method of detecting and classifying facial muscle movements, including the steps of:
receiving bio-signals from at least one bio-signal detector; and
applying at least one facial muscle movement-detection algorithm to a portion of the bio-signals affected by a predefined type of facial muscle movement in order to detect facial muscle movements of that predefined type.
The step of applying at least one facial movement-detection algorithm to the bio-signals may include:
comparing the bio-signal portion to a signature defining one or more distinctive signal characteristics of the predefined facial muscle movement type.
In a first embodiment of the invention, the step of applying at least one facial muscle movement-detection algorithm to the bio-signals may include:
directly comparing bio-signals from one or more predetermined bio-signal detectors to the signature.
In another embodiment of the invention, the step of applying at least one facial muscle movement-detection algorithm to the bio-signals may include:
projecting bio-signals from a plurality of bio-signal detectors onto one or more predetermined component vectors; and
comparing the projections onto the one or more component vectors to that signature.
The predetermined component vectors may be determined from applying a first component analysis to historically collected bio-signals generated during facial muscle movements of the type corresponding to that first signature. The first component analysis applied to the historically collected bio-signals may be independent component analysis (ICA). Alternatively, the first component analysis applied to the historically collected bio-signals may be principal component analysis (PCA). In this embodiment, the method may further include the steps of:
applying a second component analysis to the detected bio-signals; and
using the results of the second component analysis to update the one or more predetermined component vectors during bio-signal detection.
The second component analysis may be principal component analysis (PCA).
In yet another embodiment of the invention, the step of applying at least one facial muscle movement-detection algorithm to the bio-signals may include:
applying a desired transform to the bio-signals; and
comparing the results of the desired transform to that signature.
The desired transform may be selected from any one or more of a Fourier transform, wavelet transform or other signal transformation method.
The step of applying at least one facial muscle movement-detection algorithm to the bio-signals may further include the step of:
separating the bio-signals resulting from the predefined type of facial
muscle movement from one or more sources of noise in the bio-signals.
The sources of noise may include any one or more of electromagnetic interference (EMI), bio-signals not resulting from the predefined type of facial muscle movement and other muscle artefacts.
The step of applying one or more than one facial muscle movement-detection algorithm to the bio-signals may include comparing the sum or difference of bio-signals from one or more pairs of bio-signal detectors to that signature.
The step of applying one or more than one facial muscle movement-detection algorithm to the bio-signals may further include comparing bio-signals from each of the one or more pairs of bio-signal detectors to that signature.
The comparing step may include:
tracking a derivative of one or more than one of the bio-signals from each of the one or more pairs of bio-signal detectors and the sum or difference of bio-signals from the one or more pairs of bio-signal detectors.
The comparing step may further include:
comparing one or both of gradient and amplitude for one or more than one of the bio-signals from each of the one or more pairs of bio-signal detectors and the sum or difference of bio-signals from the one or more pairs of bio-signal detectors; and
determining when one or both of the gradient and amplitude respectively exceeds predetermined gradient and amplitude thresholds.
The comparing step may further include:
computing the correlation between bio-signals from each of the one or more pairs of bio-signal detectors; and
determining when the correlation exceeds a predetermined correlation threshold.
The step of applying one or more than one facial muscle movement-detection algorithm to the bio-signals may include comparing the power of bio-signals from one or more predetermined bio-signal detectors to that signature.
The comparing step may include summing the power of bio-signals from one or more pairs of bio-signal detectors; and
determining whether the sum exceeds a predetermined threshold indicative of a first facial muscle movement type.
The comparing step may include computing the ratio of the power of bio-signals from a first group of bio-signal detectors to the power of bio-signals from a second group of bio-signal detectors; and
determining whether the ratio exceeds a predetermined threshold indicative of a second facial muscle movement type.
In one or more embodiments of the invention, the bio-signals may include any one or more of electroencephalograph (EEG) signals, electrooculograph (EOG) signals and electromyography (EMG) signals.
The method may further include the step of:
generating an output signal representative of the detected facial muscle movement type for input to an electronic entertainment application or other application.
Another aspect of the invention provides an apparatus for detecting and classifying facial muscle movements, including: a processor and associated memory device for causing the processor to
carry out the method described above.
Yet another aspect of the invention provides a computer program product, tangibly stored on a machine-readable medium, the product comprising instructions operable to cause a processor to carry out the method described above.
A further aspect of the invention provides a computer program product comprising instructions operable to cause a processor to carry out the method described above.
FIGURES
These and other features, aspects and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying figures which depict various views and embodiments of the device, and some of the steps in certain embodiments of the method of the present invention, where:
Figure 1 is a schematic diagram of an apparatus for detecting and classifying facial muscle movements in accordance with the present invention;
Figure 2 is a schematic diagram illustrating the positioning of scalp electrodes forming part of a headset used in the apparatus shown in Figure 1;
Figure 3 is a flow chart illustrating the broad functional steps performed by the apparatus in Figure 1;
Figures 4 and 5 represent exemplary signals from selected electrodes shown in Figure 2 during predefined facial movements;
Figure 6 is a representation of signals from the scalp electrode shown in Figure 2 during a number of facial muscle movements;
Figure 7 is a flow chart illustrating the steps performed in the development of signatures defining distinctive signal characteristics of predefined facial muscle movement types used in the apparatus of Figure 1 during the detection and classification of facial muscle movement;
Figure 8 is a conceptual representation of the decomposition of signals from the sensors shown in Figure 2 into predetermined components as performed by the apparatus of Figure 1, in at least one mode of operation;
Figure 9 is a representation of a signal from one of the sensors shown in Figure 2 during a sequence of eye blinks;
Figure 10 is a flow chart illustrating the steps performed by the apparatus of Figure 1 both before and during bio-signal detection and classification in at least one mode of operation;
Figure 11 is a schematic diagram showing an eye blink component vector present in the bio-signals captured from the sensors shown in Figure 2 during an exemplary eye blink;
Figure 12 is a flow chart of one exemplary algorithm for detecting and classifying facial muscle movements as eye blinks;
Figure 13 shows a representation of a bio-signal detected from an exemplary sensor shown in Figure 2 and subsequent analysis performed on that bio-signal;
Figure 14 represents a flow chart of another exemplary algorithm for detecting and classifying facial muscle movements as eye blinks;
Figure 15 is a flow chart of one exemplary algorithm for detecting and classifying facial muscle movements as smiles or clenches; and
Figure 16 is a representation of signals from the sensors shown in Figure
2 during a smile.
DETAILED DESCRIPTION
Turning now to Figure 1, there is shown generally an apparatus 100 for detecting and classifying facial muscle movements. The apparatus 100 includes a headset 102 of bio-signal detectors capable of detecting various bio-signals from a subject, particularly electrical signals produced by the body, such as electroencephalograph (EEG) signals, electrooculograph (EOG) signals, electromyograph (EMG) signals or like signals. The apparatus 100 can also include bio-signal detectors capable of detecting other physiological signals, such as skin conductance. In the exemplary embodiment illustrated in the drawings, the headset 102 includes a series of scalp electrodes for capturing EEG signals from the user. The scalp electrodes may directly contact the scalp or alternatively may be of the non-contact type that does not require direct placement on the scalp. Unlike systems that provide high-resolution 3-D brain scans, e.g., MRI or CAT scans, the headset is generally portable and non-constraining.
The electrical fluctuations detected over the scalp by the series of scalp sensors are attributed largely to brain tissue located at or near the skull. The source is the electrical activity of the cerebral cortex, a significant portion of which lies on the outer surface of the brain below the scalp. The scalp electrodes pick up electrical signals naturally produced by the brain and make it possible to observe electrical impulses across the surface of the brain. Although in this exemplary embodiment the headset 102 includes several scalp electrodes, in other embodiments only one or more scalp electrodes, e.g. sixteen electrodes, may be used in a headset.
Traditional EEG analysis has focused solely on these signals from the brain. The main applications have been explorative research in which different rhythms (alpha wave, beta wave, etc.) have been identified, pathology detection in which the onset of dementia or physical injury can be detected, and self-improvement devices in which bio-feedback is used to aid in various forms of
meditation. Traditional EEG analysis considers signals resulting from facial muscle movement, such as eye blinks, to be artefacts that mask the real EEG signal desired to be analysed. Various procedures and operations are performed to filter these artefacts out of the collected EEG signals.
The applicants have developed technology that enables the sensing and collecting of electrical signals from the scalp electrodes, and the application of signal processing techniques to analyze these signals in order to detect and classify human facial expressions such as blinking, winking, frowning, smiling, laughing, talking etc. The result of this analysis is able to be used by a variety of other applications, including but not being limited to electronic entertainment applications, computer programs and simulators.
Each of the signals detected by the headset 102 of electrodes is fed through a sensor interface 104, which can include an amplifier to boost signal strength and a filter to remove noise, and then digitized by an analogue-to-digital converter 106. Digitized samples of the signal captured by each of the scalp sensors are stored during operation of the apparatus 100 in a data buffer
108 for subsequent processing.
The apparatus 100 further includes a processing system 109 including a digital signal processor 112, a co-processing device 110 and an associated memory device for storing a series of instructions (otherwise known as a computer program or computer control logic) to cause the processing system
109 to perform desired functional steps. Notably, the memory includes a series of instructions defining at least one algorithm 114 to be performed by the digital signal processor 112 for detecting and classifying a predetermined type of facial muscle movement. Upon detection of each predefined type of facial muscle movement, a corresponding control signal is transmitted in this exemplary embodiment to an input/output interface 116 for transmission via a wireless transmission device 118 to a platform 120 for use as a control input by electronic entertainment applications, programs, simulators or the like.
In one embodiment, the algorithms are implemented in software and the series of instructions is stored in the memory of the processing system 109. The series of instructions causes the processing system 109 to perform the functions of the invention as described herein. Prior to being loaded into the memory, the instructions can be tangibly embodied in a machine-readable storage device, such as a computer disk or memory card, or in a propagated signal. In another embodiment, the algorithms are implemented primarily in hardware using, for example, hardware components such as application specific integrated circuits (ASICs). Implementation of the hardware state machine so as to perform the functions described herein will be apparent to persons skilled in the relevant art. In yet another embodiment, the algorithms are implemented using a combination of software and hardware.
Other implementations of the apparatus 100 are possible. Instead of a digital signal processor, an FPGA (field programmable gate array) could be used. Rather than a separate digital signal processor and co-processor, the processing functions could be performed by a single processor. The buffer 108 could be eliminated or replaced by a multiplexer (MUX), and the data stored directly in the memory of the processing system. A MUX could be placed before the A/D converter stage so that only a single A/D converter is needed. The connection between the apparatus 100 and the platform 120 can be wired rather than wireless.
Although the apparatus 100 is illustrated in Figure 1 with all processing functions occurring in a single device that is external to the platform, other implementations are possible. For example, the apparatus can include a head set assembly that includes the head set, a MUX, A/D converter(s) before or after the MUX, a wireless transmission device, a battery for power supply, and a microcontroller to control battery use, send data from the MUX or A/D converter to the wireless chip, and the like. The apparatus can also include a separate processor unit that includes a wireless receiver to receive data from the headset assembly, and the processing system, e.g. the digital signal processor and the co-processor. The processor unit can be connected to the platform by a wired or wireless connection. As another example, the apparatus can include a head
set assembly as described above, the platform can include a wireless receiver to receive data from the headset assembly, and a digital signal processor dedicated to detection of facial muscle movement can be integrated directly into the platform. As yet another example, the apparatus can include a head set assembly as described above, the platform can include a wireless receiver to receive data from the headset assembly, and the facial muscle movement detection algorithms are performed in the platform by the same processor, e.g., a general purpose digital processor, that executes the application, programs, simulators or the like.
Figure 2 illustrates one example of the positioning system 200 of the scalp electrodes forming part of the headset 102. The system 200 of electrode placement shown in Figure 2 is referred to as the "10-20" system and is based on the relationship between the location of an electrode and the underlying area of cerebral cortex. Each point on the electrode placement system 200 indicates a possible scalp electrode position. Each site includes a letter to identify the lobe and a number or other letter to identify the hemisphere location. The letters F, T, C, P, O stand for Frontal, Temporal, Central, Parietal and Occipital. Even numbers refer to the right hemisphere and odd numbers refer to the left hemisphere. The letter Z refers to an electrode placed on the mid-line. The midline is a line along the scalp on the sagittal plane originating at the nasion and ending at the inion at the back of the head. The "10" and "20" refer to percentages of the mid-line division. The mid-line is divided into 7 positions, namely, Nasion, Fpz, Fz, Cz, Pz, Oz and Inion, and the angular intervals between adjacent positions are 10%, 20%, 20%, 20%, 20% and 10% of the mid-line length respectively.
The headset 102, including scalp electrodes positioned according to the system 200, is placed on the head of a subject in order to detect EEG signals. As seen in Figure 3, at step 300, the EEG signals are captured by a neurophysiological signal acquisition device and then converted into the digital domain at step 302 using the analogue to digital converters 106. A series of digitized signals from each of the sensors is then stored at step 304 in the data buffer 108. One or more facial muscle movement-detection algorithms are then
applied at step 306 in order to detect and classify different facial muscle movements, including facial expressions or other muscle movements. Each of the algorithms generates a result representing the facial expression(s) of the subject. These results are then passed on to the output block 116 at step 308, where they can be used by a variety of applications.
In traditional EEG research, many signals resulting from eye blinks and other facial muscle movements have been considered to be artefacts masking the real EEG signal required for analysis. Figure 4 shows a representation 400 of a signal from the Fp1 or Fp2 electrode (as seen in the electrode positioning system 200 shown in Figure 2) during a series of eye blinks. Similarly, Figure 5 shows a representation 500 of a signal from the T7 or T8 electrode resulting from a series of smiles by a subject.
Figure 6 shows a representation 600 of the signals from each of the electrodes in the headset 102 when various eye movements are performed by the subject. The impact of an up, down, left and right eye movement can be observed from the circled portions of the signal representations. Rather than considering the impact upon the EEG signals resulting from facial muscle movements to be an artefact that pollutes the quality of the EEG signals, the apparatus 100 acts to isolate these perturbations and then apply one or more algorithms in order to classify the type of facial muscle movement responsible for producing the perturbations. The apparatus 100 applies at least one facial muscle movement-detection algorithm 114 to a portion of the bio-signals captured by the headset 102 affected by a predefined type of facial muscle movement in order to detect facial muscle movements of that predefined type. In order to do so, a mathematical signature defining one or more distinctive characteristics of the predefined facial muscle movement type is stored in the memory device 112. The relevant portion of the bio-signals affected by the predefined type of facial muscle movement is then compared to that mathematical signature.
In order to generate the mathematical signature for each facial muscle movement, and as shown in Figure 7, stimuli are developed at step 700 to elicit
that particular facial expression. The stimuli are generally in the form of an audio visual presentation or a set of commands. The set of stimuli is tested at step 702 until a high degree of correlation between the developed stimuli and the resultant desired facial muscle movement is obtained. Once a set of effective stimuli is developed, EEG signal recordings are made at step 704 that contain many examples of the desired facial muscle movements. Ideally, these facial muscle movements should be as natural as possible.
Once the EEG signal recordings are collected, signal processing operations are then performed at step 706 in order to identify one or more distinctive signal characteristics of each predefined facial muscle movement type. Identification of these distinctive signal characteristics in each EEG signal recording enables the facial muscle movement of a subject to be classified at step 708 and an output signal representative of the detected type of facial muscle movement to be generated at step 710. Testing and verification of the output signal at step 712 enables a robust data set to be established.
In some embodiments, it may be necessary to develop a mathematical signature for each subject. In other embodiments of the invention, a generic mathematical signature can be developed for each type of facial muscle movement, e.g., using a limited number of subjects, and stored in the memory of the digital signal processor 112 without requiring the aforementioned steps to be carried out by each subject.
In one of the modes of operation, the portion of the bio-signals affected by a predefined type of facial muscle movement is predominantly found in signals from a limited number of scalp electrodes. For example, eye movement and blinking can be detected by using only two electrodes near the eyes, such as the Fp1 and Fp2 channels shown in Figure 2. In this case, signals from those electrodes can be directly compared to the mathematical signatures defining the distinctive signal characteristics of the eye blink or other predefined facial muscle movement type.
It is also possible to combine the signals from one or more electrodes together, and then to compare that combined bio-signal to one or more signatures defining the distinctive signal characteristics of predefined facial muscle movement types. A weighting may be applied to each signal prior to the signal combining operation in order to improve the accuracy of the facial muscle movement detection and classification.
In other modes of operation, the apparatus 100 acts to decompose the scalp electrode signals into a series of components and then to compare the projection of the bio-signals from the scalp electrodes onto one or more predetermined component vectors with the mathematical signatures defining the signal characteristics of each type of facial muscle movement.
In this regard, independent component analysis (ICA) has been found to be useful for defining the characteristic forms of the potential function across the entire scalp. Independent component analysis maximizes the degree of statistical independence among outputs using a series of contrast functions. As seen in Figure 8, in ICA the rows of an input matrix X represent data samples from the bio-signals in the headset 102 recorded at different electrodes, whereas the columns are measurements recorded at different time points. Independent component analysis finds an "unmixing" matrix W which decomposes or linearly unmixes the multi-channel scalp data into a sum of temporally independent and spatially fixed components. The rows of the output data matrix U = WX are the time courses of activation of the ICA components. The columns of the inverse matrix, W⁻¹, give the relative projection strength of each of the signals from the scalp electrodes onto respective component vectors. These scalp weights give the scalp topography of each component vector.
Another technique for the decomposition of the bio-signals into components is principal component analysis (PCA) which ensures that output components are uncorrelated. In various embodiments of the invention, either or both of independent component analysis and principal component analysis may be used in order to detect and classify facial muscle movements.
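As an illustrative sketch of this decomposition-and-projection idea, the following Python fragment builds a toy four-channel recording with a shared artefact waveform and recovers a component vector via a minimal PCA (an eigendecomposition of the channel covariance). The channel count, waveforms and scales are assumptions for illustration, not the patent's data.

```python
import numpy as np

# Toy multi-channel recording: rows are electrodes, columns are time samples,
# matching the X matrix convention used in the ICA description above.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 256)
blink = np.exp(-((t - 0.5) / 0.05) ** 2)        # shared artefact waveform
X = rng.normal(scale=0.1, size=(4, 256))        # background "EEG" on 4 channels
X[0] += 1.0 * blink                             # the artefact projects mainly
X[1] += 0.8 * blink                             # onto the two frontal channels

# Minimal PCA: eigendecomposition of the channel covariance matrix.  The
# eigenvector of the largest eigenvalue acts as a component vector whose
# entries are the per-electrode weights (the "scalp topography").
eigvals, eigvecs = np.linalg.eigh(np.cov(X))
component = eigvecs[:, -1]

# Projecting the multi-channel data onto the component vector yields a single
# activation time course that can be compared against a stored signature.
activation = component @ X
```

With this construction the largest weights land on the two channels carrying the artefact, and the activation time course peaks where the synthetic blink was injected.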
In other modes of operation, the apparatus 100 may act to apply a desired Fourier transform to the bio-signals from the scalp electrodes. The transform could alternatively be a wavelet transform or any other suitable signal transformation method. Combinations of one or more different signal transformation methods may also be used. Portions of the bio-signals affected by a predefined type of facial muscle movement may then be identified using a neural network.
Each of the above described techniques for detection and classification of the facial muscle movements may be incorporated into a facial muscle movement detection algorithm stored in the memory of, and performed by, the digital signal processor 112. Once a particular facial muscle movement detection algorithm has been fully developed, the algorithm may be implemented as a real-time software program or transferred into a digital signal processing or other suitable environment.
As an example of the type of facial muscle movement that can be detected and classified by the apparatus 100, a facial expression algorithm for the detection of an eye blink will now be described. It is to be understood that the general principles described in relation to the algorithm are also applicable to the detection and classification of other types of facial muscle movement, such as winks and eyeball motions. Eye blinks are present in all anterior electrodes but feature most prominently in the two frontal channels Fp1 and Fp2. Figure 9 is a representation 900 of the bio-signal recorded at the scalp electrode Fp1 during 3 typical eye blinks. It can be seen from signal portions 902, 904 and 906 of the bio-signal from the frontal channel Fp1 that each of the 3 eye blinks has a significant effect on the bio-signal. In this example, the projections of the bio-signals from the frontal electrodes Fp1 and Fp2 on predetermined component vectors are used to detect and classify the perturbation in the bio-signals as an eye blink.
In one embodiment of the invention, the predetermined component vectors are identified from historically collected data from a number of subjects and/or across a number of different sessions. As shown in Figure 10, the EEG
data from a number of different subjects and/or across a number of different sessions are recorded at step 1000 when the desired facial muscle movements are being generated by the subjects. At step 1002, independent component analysis is performed on the recorded EEG data and the component vectors onto which are projected the perturbations in the EEG signals resulting from the relevant facial muscle movement are determined at step 1004. The relevant component vectors to be used in subsequent data recording and analysis are then recorded in the storage device 112 by facial muscle movement type. In this case, three exemplary types of facial muscle movement are able to be classified, namely vertical eye movement at step 1006, horizontal eye movement at step 1008 and an eye blink at step 1010.
Independent component analysis is a computationally time-consuming activity and in many instances is inappropriate for certain applications, such as real-time use. Whilst independent component analysis may be used to generate average component vectors for use in the detection and classification of various types of facial muscle movements, the balance of signals across different electrodes varies slightly across different sessions and users.
Accordingly, the average component vectors defined using independent component analysis of historically gathered data may not be optimal during real-time data detection and classification. During real-time operation of the apparatus 100, principal component analysis can be performed on the real-time data and the resulting component vector can be used to update the component vector generated by independent component analysis throughout each session. In this way, the resulting facial muscle movement-detection algorithms can be made robust against electrode shifting and variances in the strength of electrode contact.
As can be seen at step 1012, the projection of the historically collected data on the component vector is initially used as a reference in the facial muscle movement detection algorithms 114. However, as data is collected and stored in the data buffer 108 at step 1014, principal component analysis is carried out at step 1016 on the stored data, and the results of the analysis generated at
step 1018 are then used to update the component vectors developed during offline independent component analysis.
As has been previously described, component vectors can be used in order that a correct weighting is applied to the contribution from the signals of each relevant electrode. An example of an eye-blink component vector is shown in the vector diagram 1100 in Figure 11. From this diagram it can be seen that the largest contribution to the component is indeed from the two frontal electrodes Fp1 and Fp2. However, it is also apparent that the eye blink is not symmetric. In this case, the potential around the electrode Fp2 is larger than that of the electrode Fp1. The difference may be due to a number of causes, for example muscle asymmetry, the electrodes not being symmetrically located on the head of the subject, or a difference in the electrical impedance of the contact with the scalp. This diagram illustrates the desirability of optimizing the component vectors during each session, for example by applying the steps illustrated in Figure 10.
Figure 12 shows one example of a facial muscle movement-detection algorithm 1200 used to detect an eye blink. The algorithm 1200 may be applied to the activations of component vectors or alternatively may be applied to signals from individual scalp electrodes or from combinations of signals from more than one scalp electrode. In one embodiment, the projection of the EEG signals onto the component vector associated with an eye blink is initially passed through a low pass filter at step 1202. A first order derivative operation is then performed on the signal at step 1204. In short, the first order derivative of a function f with respect to an infinitesimal change in x is defined as
f'(x) ≡ lim_{h→0} [f(x + h) - f(x)] / h. The result of this process for a single eye blink is shown in Figure 13. The original component vector is referenced 1300, whereas the signals resulting from the low pass filtering and from the first order derivative operation are referenced 1302 and 1304 respectively.
Of particular interest are zero-crossing points in the first order derivative signal, which fall into two categories: positive zero-crossing points and negative zero-crossing points. The sign (namely either positive or negative) of the zero-crossing point indicates whether the signal increases or decreases after crossing the axis. For each eye blink, there are two positive zero-crossing points, respectively referenced 1306 and 1308 in Figure 13. These positive zero-crossing points define the boundary conditions of an eye blink. A negative zero-crossing point 1310 defines the peak of the eye blink. Accordingly, the algorithm 1200 determines at step 1206 whether a zero-crossing point occurs in the digitized data stored in the data buffer 108. If this is the case, a determination is made at step 1208 as to whether the crossing is a positive or a negative zero-crossing. If a negative crossing is detected, the peak amplitude of the corresponding signal is checked at step 1210 to verify whether this transitory rise in signal amplitude is from a real eye blink. If a positive zero-crossing point is detected, the algorithm stores this information in the state queue at step 1214 in cases where there is no preceding negative zero-crossing point, stored in the queue, whose corresponding signal amplitude satisfies the peak value condition determined at step 1212. If there is such a preceding negative zero-crossing point stored in the state queue, an assertion that there is an eye blink is made at step 1212. The algorithm resets if the detected negative zero-crossing point does not satisfy the peak value condition or once an eye blink detection assertion is made.
Accordingly, once the zero-crossing points are identified, the algorithm verifies whether there exists a negative zero-crossing point sandwiched between the two positive zero-crossing points, and whether the eye blink peak passes the amplitude threshold. A default value of the amplitude threshold is initially used, but to increase the accuracy of the algorithm, the threshold amplitude is optionally adjusted at step 1218 based upon the strength of an individual's eye blink peaks.
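A condensed sketch of this zero-crossing logic is shown below. It is not the patented algorithm 1200 itself: a moving average stands in for the low pass filter of step 1202, the state queue is reduced to a single pending-peak variable, and all names and thresholds are illustrative.

```python
import numpy as np

def moving_average(x, w=8):
    # crude stand-in for the low pass filter applied at step 1202
    return np.convolve(x, np.ones(w) / w, mode="same")

def detect_blinks(signal, peak_threshold):
    filtered = moving_average(np.asarray(signal, dtype=float))
    deriv = np.diff(filtered)                   # first order derivative (step 1204)
    blinks = []
    pending_peak = None                         # candidate blink peak, if any
    for i in range(1, len(deriv)):
        rising = deriv[i - 1] < 0 <= deriv[i]   # positive zero-crossing
        falling = deriv[i - 1] > 0 >= deriv[i]  # negative zero-crossing (a peak)
        if falling and filtered[i] > peak_threshold:
            pending_peak = i                    # peak is high enough to be a blink
        elif rising and pending_peak is not None:
            blinks.append(pending_peak)         # peak bounded by positive crossings
            pending_peak = None
    return blinks
```

A peak is only asserted once it is followed by a positive zero-crossing, mirroring the "sandwiched" condition; peaks below the amplitude threshold are ignored entirely.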
In this example, the eye blink "signature" defines the distinctive signal characteristics representative of an eye blink, namely a negative zero crossing sandwiched between two positive zero crossings in the first order derivative of the filtered signal, and a signal amplitude greater than a predetermined threshold in the filtered signal. The signature is optionally updated by changing the threshold forming part of the distinctive signal characteristics of the
signature during facial muscle movement detection and classification. In other embodiments, the digital signature may define other amplitudes or signal characteristics that exceed one or more predetermined thresholds. The signature may be updated during facial muscle movement detection and classification by changing one or more of those thresholds. More generally, any one or more distinctive signal characteristics of a predetermined facial muscle movement type that form part of a digital signature can be updated during the course of facial muscle movement detection and classification in order to improve the viability and accuracy of the facial muscle movement detection algorithms implemented by the apparatus 100.
The specific channels used to detect and classify various facial expressions may differ according to the particular facial expression in question. In addition to using signals from individual channels or activations of component vectors, the sum or difference of channel pairs may be used in facial muscle movement detection algorithms. Figure 14 shows another example of a facial muscle movement detection algorithm 1400 used to detect an eye blink. The algorithm 1400 involves examining the correlation of signals between channel pairs, together with the amplitude and gradient of the bio-signals from each of the channels Fp1 and Fp2 and of the sum of bio-signals from that pair of channels. In this exemplary algorithm, at step 1402, bio-signals from the sensors shown in Figure 2 are sampled at a rate of 256 Hz, resulting in 256 samples per second being generated for each of the 32 channels. At step 1404, channels 1 and 30, corresponding respectively to bio-signals from the sensors Fp1 and Fp2, are extracted for a data window of 1/32 of a second containing 8 samples. At step 1406, the samples from channels 1 and 30 are summed.
At step 1408, a third order infinite impulse response (IIR) low pass filter is applied at 10 Hz, whilst at step 1410 a first order IIR high pass filter is applied at 0.125 Hz.
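For illustration, the fragment below implements first-order IIR low-pass and high-pass recursions at the cutoffs mentioned above. The patent specifies a third-order low-pass, which a proper filter-design routine would supply; this hand-rolled first-order form only sketches the recursive structure of an IIR filter.

```python
import math

def first_order_lowpass(x, cutoff_hz, fs=256.0):
    # y[n] = y[n-1] + alpha * (x[n] - y[n-1]), alpha set by the RC time constant
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    alpha = (1.0 / fs) / (rc + 1.0 / fs)
    y, prev = [], 0.0
    for sample in x:
        prev = prev + alpha * (sample - prev)
        y.append(prev)
    return y

def first_order_highpass(x, cutoff_hz, fs=256.0):
    # complementary form: the high-pass output is the input minus its low-pass part
    return [s - l for s, l in zip(x, first_order_lowpass(x, cutoff_hz, fs))]
```

A 0.125 Hz high-pass of this kind slowly removes DC offset, while a 10 Hz low-pass strongly attenuates a full-rate alternating signal.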
At step 1412, a first order derivative operation is performed on the sum of bio-signals from channels Fp1 and Fp2. Similar to the aforementioned
algorithm 1200, an eye blink peak is tracked by the negative zero-crossing point of the first order derivative. The rise and the fall of an eye blink signal peak are bounded by the positive zero-crossing points respectively preceding and following this negative zero-crossing point of the first order derivative. An assessment is made as to whether the correlation between the filtered signals of channels Fp1 and Fp2 for the window of data bounded by a positive zero-crossing and a negative zero-crossing for the rise of a peak (and vice versa for the fall of a peak) exceeds a first predetermined threshold at step 1414, whether the lesser amplitude of the rise or the fall of an eye blink peak signal from either of the individual channels Fp1 and Fp2 exceeds a second predetermined threshold at step 1416, and whether the maximum gradient determined from the peak or trough of the first order derivative of the summed signals from channels Fp1 and Fp2 exceeds a third predetermined threshold at step 1418. If all three values exceed their respective thresholds, an eye opening or eye closing event is detected at step 1420 (depending on whether the maximum gradient is positive or negative).
Similar algorithms can be used to identify winks, eyeball motions or other related facial muscle expressions. Other algorithms may use different combinations of signal correlation, amplitude displacement and signal gradient measurement, as well as assessing the sum or difference of bio-signals from channel pairs other than those used in the algorithm illustrated in Figure 14.
Other algorithms used to detect and classify facial muscle movements rely upon the determination of signal power upon particular channels, the sum of signal power on one or more pairs of signal channels, the difference of signal power between one or more pairs of channels and/or the ratio of the signal power on one or more channels or one or more channel pairs to the power of signals on one or more other channels or one or more other channel pairs. By using the exemplary algorithm 1500 shown in Figure 15, either a smile or clench can be detected. This exemplary algorithm determines a metric that is calculated by using the signal power on particular channels, signal power on the sum of bio-signals on particular channels and ratios of the sums of signal power
on signal channel pairs. At step 1502, the bio-signals from each of the detectors shown in Figure 2 are sampled at 256 Hz. At step 1504, data from several channel pairs is extracted, namely data from channels 6 and 11 corresponding to detectors T7 and T8, channels 2 and 15 corresponding to detectors AF3 and AF4, channels 4 and 13 corresponding to detectors FC5 and FC6, and channels 3 and 14 corresponding to detectors F7 and F8.
At step 1506, a data window is created for each of these channels in which 64 consecutive samples (corresponding to a quarter of a second) are considered. At step 1508, the power of the signal represented by the 64 consecutive samples in the data window for each channel is calculated, and a single power value computed for each channel at step 1510.
In order to determine whether a smile is present in the EEG data sampled at step 1502, at step 1512 the ratio of the power on channel pair T7 and T8 to the average of the power on channel pair AF3 and AF4 and channel pair FC5 and FC6 is computed. At step 1514, a low pass filter is applied to the computation carried out at step 1512. At step 1516, a determination is made as to whether the ratio computed at step 1512 exceeds a predetermined threshold indicative of a smile. If this is the case, then a smile is detected at step 1518; otherwise step 1506 is performed again with the next 64 data samples for each of the channels extracted at step 1504.
Whilst step 1512 is being computed in order to determine whether a smile is present in the bio-signal data, at step 1522 the average power on channel pair FC5 and FC6 is being computed in order to determine whether a clench is present in the bio-signals. If it is determined at step 1526 that this computed average power exceeds a predetermined threshold indicative of a clench, then a clench is detected at step 1528. If the average power computed in step 1522 does not exceed the predetermined threshold indicative of a clench at step 1526, then the power ratio computed at step 1512 is compared to the threshold indicative of a smile in order to determine whether a smile is present in the bio-signals.
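The smile/clench decision flow of Figure 15 might be sketched as follows. The mean-square power measure, the threshold values and the toy channel layout are assumptions for illustration; the patent's own power measure is described later in the text.

```python
import numpy as np

WINDOW = 64  # samples, i.e. a quarter of a second at 256 Hz

def channel_power(window):
    # mean-square power per channel over the window (one possible power measure)
    return np.mean(window ** 2, axis=1)

def classify_window(window, smile_threshold=4.0, clench_threshold=2.0):
    """window: (6, WINDOW) array ordered [T7, T8, AF3, AF4, FC5, FC6] (toy layout)."""
    p = channel_power(np.asarray(window, dtype=float))
    t7t8 = p[0] + p[1]
    af_pair = (p[2] + p[3]) / 2.0
    fc_pair = (p[4] + p[5]) / 2.0
    # clench test first, as in Figure 15: average power on the FC5/FC6 pair
    if fc_pair > clench_threshold:
        return "clench"
    # smile test: power on the T7/T8 pair relative to the AF and FC pair averages
    ratio = t7t8 / ((af_pair + fc_pair) / 2.0)
    return "smile" if ratio > smile_threshold else None
```

Driving the temporal pair hard produces a large T7/T8-to-frontal power ratio (smile), while strong FC5/FC6 power alone trips the clench branch first.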
In order to improve the efficiency of algorithms relying upon signal power detection, a signal power profile can be created from the signal power on all channels, forming a 32-channel vector. There are several ways of creating a signal power profile for an expression. For example, a combination of principal component analysis, simple statistics such as the mean and median, and manual inspection can be used to create the signal power profile. The profile can then be normalized to a unit vector for scaling simplicity. The dot product of the signal power profile and the normalized signal power on all channels can then be used as the signal to identify when a particular facial expression occurs.
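A minimal sketch of this profile matching step, assuming a simple Euclidean normalization of both vectors; `expression_score` is an illustrative helper name.

```python
import numpy as np

def expression_score(channel_powers, profile):
    # normalize both vectors to unit length; the dot product then measures how
    # closely the current power distribution matches the stored expression profile
    p = np.asarray(channel_powers, dtype=float)
    prof = np.asarray(profile, dtype=float)
    return float((p / np.linalg.norm(p)) @ (prof / np.linalg.norm(prof)))
```

A power distribution aligned with the profile scores near 1, while power concentrated on unrelated channels scores near 0.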
In the algorithm described in relation to Figure 15, signal power on a channel is calculated by taking the first order derivative of the channel signal, which is sometimes first passed through a high pass filter. The absolute value of the first order derivative of the channel signal is then taken, and some fraction of the lowest and highest values is discarded. These fractions may typically be 3/8 of the lowest and 1/8 of the highest values. The mean of the non-discarded portion is then calculated. The data window used to calculate the signal power is normally in the order of 1/4 of a second.
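This trimmed-mean calculation translates almost directly into NumPy; the discard fractions follow the text, while the function name is an assumption.

```python
import numpy as np

def signal_power(channel):
    """Trimmed-mean 'power' of one channel over a window of samples."""
    d = np.abs(np.diff(np.asarray(channel, dtype=float)))  # |first derivative|
    d.sort()
    lo = int(len(d) * 3 / 8)     # discard the lowest 3/8 of the values ...
    hi = int(len(d) * 7 / 8)     # ... and the highest 1/8
    return float(d[lo:hi].mean())
```

Note that this "power" scales linearly with signal amplitude (it is derivative-based), rather than quadratically as a conventional power measure would.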
Alternative methods of calculating signal power can be used in other embodiments of the invention. These methods may be based upon signal power present in particular frequency bands on channels, or the ratio of power in two different frequency bands or two different channels. These could be different channels on the same band, different bands on the same channel or different bands on different channels.
The channel correlation performed in relation to the algorithm shown in Figure 14 can be calculated according to the expression sum(xy)/sqrt(sum(x²)·sum(y²)). In some embodiments of the invention, however, an alternative measure of correlation is used: sum(xy)/sqrt(sum(max(x,y)²)). In both of the preceding expressions, the terms x and y correspond to the two signals being correlated. This latter correlation method has the advantage of reducing the value when two signals have similar profiles but one is very much larger than the other.
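Both measures can be transcribed into Python as follows; the first is the standard normalized (cosine) form, while the second follows the text as written — note that, as written, the alternative is not normalized to 1 even for identical signals.

```python
import math

def correlation(x, y):
    # standard normalized cross-correlation (cosine similarity of the windows)
    num = sum(a * b for a, b in zip(x, y))
    return num / math.sqrt(sum(a * a for a in x) * sum(b * b for b in y))

def correlation_alt(x, y):
    # alternative measure transcribed literally from the text; it is not
    # normalized to 1 for identical signals
    num = sum(a * b for a, b in zip(x, y))
    return num / math.sqrt(sum(max(a, b) ** 2 for a, b in zip(x, y)))
```

The standard form is invariant to scaling either signal, which is exactly why an alternative that is sensitive to magnitude differences can be useful here.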
The facial expression detection algorithms described above can have multiple threshold values associated therewith. These threshold values may not be intuitive to adjust and it is therefore useful to be able to translate these thresholds into one or more intuitive "sensitivity" parameters.
For example, the effect that an eye blink has on the EEG signal from any one or more of the detectors shown in Figure 2 can be simply understood as concurrent upward deflections on a number of frontal channels. These deflections may be characterized by the minimum height of the deflection, the minimum gradients of the deflection and the minimum correlations of the signals on different channels. How these three thresholds are combined is not necessarily straightforward to the non-expert user. In order to convert the multiple thresholds to a single sensitivity parameter, it may be desirable to evaluate a reasonable range for each threshold. Each threshold may then be interpolated, either linearly or otherwise, between the minimum and maximum values of the reasonable range. Accordingly, suppose the minimum threshold value for an individual parameter is thresh_min and the maximum is thresh_max. Given a sensitivity parameter s that varies over a range of 0 to 1 and using linear interpolation, the threshold value for that parameter would be:
threshold = thresh_max + (thresh_min - thresh_max) * s
Additionally, the sensitivity parameter may be inferred from any of the individual thresholds based on:
s = (threshold - thresh_max) / (thresh_min - thresh_max)
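The interpolation and its inverse are straightforward to express; the names mirror thresh_min and thresh_max from the text.

```python
def threshold_from_sensitivity(s, thresh_min, thresh_max):
    # s = 0 gives the least sensitive setting (thresh_max); s = 1 the most sensitive
    return thresh_max + (thresh_min - thresh_max) * s

def sensitivity_from_threshold(threshold, thresh_min, thresh_max):
    # inverse mapping, valid whenever thresh_min != thresh_max
    return (threshold - thresh_max) / (thresh_min - thresh_max)
```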
Due to the variance in the musculature of different people, the expressions detected via noise profiling may have very different values. Automatic calibration of these algorithms can be performed to cater for this variability. Calibration can be performed by recording a "neutral" state, defined as anything but the expressions that are being calibrated. Noise values are calculated for this period, the values obtained are sorted, and the lower 50% of the values are discarded. The average of the remaining values is then used as a baseline above which it can be assumed that some small amount of expression is present.
Calibration of the expression's maximum threshold can then be done by using a multiple of the lower threshold, for example 2 or 3 times the lower threshold. Alternatively, the subject can be asked to perform the expression, and the values obtained during this period can be used to determine the upper end of the facial expression range. Care should be taken, as the perturbations caused by forced expressions may not be as large as naturally occurring expressions, so the maximum value found during such a calibration should be set to 50% of the range.
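The neutral-state calibration described above can be sketched as follows, with a 2x headroom multiple as an illustrative default.

```python
def calibrate_thresholds(neutral_noise_values, headroom=2.0):
    """Derive lower/upper detection thresholds from a recorded 'neutral' period."""
    ordered = sorted(neutral_noise_values)
    upper_half = ordered[len(ordered) // 2:]             # discard the lower 50%
    lower_threshold = sum(upper_half) / len(upper_half)  # baseline noise level
    upper_threshold = headroom * lower_threshold         # e.g. 2-3x the baseline
    return lower_threshold, upper_threshold
```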
Although the present invention has been discussed in considerable detail with reference to certain preferred embodiments, other embodiments are possible. Therefore, the scope of the appended claims should not be limited to the description of preferred embodiments contained in this disclosure. All references cited herein are incorporated by reference in their entirety.
Claims
1. A method of detecting and classifying facial muscle movements, including the steps of: a) receiving bio-signals from one or more than one bio-signal detector; and b) applying one or more than one facial muscle movement-detection algorithm to a portion of the bio-signals affected by a predefined type of facial muscle movement in order to detect the facial muscle movements of the predefined type.
2. The method according to claim 1, wherein the step of applying one or more than one facial muscle movement-detection algorithm to the bio-signals includes comparing the bio-signal portion to a signature defining one or more than one distinctive signal characteristics of the predefined facial muscle movement type.
3. The method according to claim 2, wherein the step of applying one or more than one facial muscle movement-detection algorithm to the bio-signals includes directly comparing bio-signals from one or more than one predetermined bio-signal detectors to that signature.
4. The method according to claim 2, wherein the step of applying one or more than one facial muscle movement-detection algorithm to the bio-signals includes: a) projecting bio-signals from the plurality of bio-signal detectors on one or more than one predetermined component vectors; and b) comparing the projection of the bio-signals onto one or more than one component vectors to that signature.
5. The method according to claim 4, further including applying a desired transform to the projected bio-signal after the projection of the bio-signals from the plurality of detectors on one or more than one component vectors, and before the projected bio-signal is compared to that signature.
6. The method according to claim 4, wherein the predetermined component vectors are determined by applying a first component analysis to historically collected bio-signals generated during facial muscle movement types of the type corresponding to that signature.
7. The method according to claim 6, wherein the first component analysis applied to the historically collected bio-signals is independent component analysis (ICA).
8. The method according to claim 6, wherein the first component analysis applied to the historically collected bio-signals is principal component analysis (PCA).
9. The method according to claim 4, wherein the one or more than one component vectors are updated during facial muscle movement-detection and classification.
10. The method according to claim 2, further including updating the signature during the course of facial muscle movement-detection and classification.
11. The method according to claim 10, wherein the signature is updated by changing thresholds forming at least part of the distinctive signal characteristics of the signature.
12. The method according to claim 2, wherein the step of applying one or more than one facial muscle movement-detection algorithm to the bio-signals includes: a) applying a desired transform to the bio-signals; and b) comparing the results of the desired transform to that signature.
13. The method according to claim 12, wherein the transform is one or more than one transform selected from the group consisting of a Fourier transform and a wavelet transform.
14. A method according to claim 4 further including: a) applying a second component analysis to the detected bio-signals; and b) using the results of the second component analysis to update the one or more than one predetermined component vectors during bio-signal detection.
15. The method according to claim 14, wherein the second component analysis is principal component analysis (PCA).
16. The method according to claim 1, wherein the step of applying one or more than one facial muscle movement-detection algorithm to the bio-signals includes separating the bio-signals resulting from the predefined type of facial muscle movement from one or more than one sources of noise in the bio-signals.
17. The method according to claim 16, wherein the sources of noise comprise one or more than one source selected from the group consisting of electromagnetic interference (EMI), and bio-signals not resulting from the predefined type of facial muscle movement.
18. The method according to claim 2, wherein the step of applying one or more than one facial muscle movement-detection algorithm to the bio-signals includes comparing the sum or difference of bio-signals from one or more pairs of bio-signal detectors to that signature.
19. The method of claim 18, wherein the step of applying one or more than one facial muscle movement-detection algorithm to the bio-signals further includes comparing bio-signals from each of the one or more pairs of bio-signal detectors to that signature.
20. The method of claim 19, wherein the comparing step includes: tracking a derivative of one or more than one of the bio-signals from each of the one or more pairs of bio-signal detectors and the sum or difference of bio-signals from the one or more pairs of bio-signal detectors.
21. The method of claim 20, wherein the comparing step further includes: comparing one or both of gradient and amplitude for one or more than one of the bio-signals from each of the one or more pairs of bio-signal detectors and the sum or difference of bio-signals from the one or more pairs of bio-signal detectors; and determining when one or both of the gradient and amplitude respectively exceeds predetermined gradient and amplitude thresholds.
22. The method of either of claims 20 or 21, wherein the comparing step further includes: computing the correlation between bio-signals from each of the one or more pairs of bio-signal detectors; and determining when the correlation exceeds a predetermined correlation threshold.
23. The method according to claim 2, wherein the step of applying one or more than one facial muscle movement-detection algorithm to the bio-signals includes comparing the power of bio-signals from one or more predetermined bio-signal detectors to that signature.
24. The method according to claim 23, wherein the comparing step includes summing the power of bio-signals from one or more pairs of bio-signal detectors and comparing the sum to that signature; and determining whether the sum exceeds a predetermined threshold indicative of a first facial muscle movement type.
25. The method according to claim 23, wherein the comparing step includes computing the ratio of the power of bio-signals from a first group of bio-signal detectors to the power of bio-signals from a second group of bio-signal detectors; and determining whether the ratio exceeds a predetermined threshold indicative of a second facial muscle movement type.
26. The method according to claim 1, wherein the bio-signals include electroencephalograph (EEG) signals.
27. The method according to claim 1, further including generating an output signal representative of the detected facial muscle movement type.
28. An apparatus for detecting and classifying facial muscle movements, including: a processor and associated memory device for causing the processor to carry out a method according to any one of claims 1 to 27.
29. A computer program product, tangibly stored on a machine-readable medium, the product comprising instructions operable to cause a processor to carry out a method according to any one of claims 1 to 27.
30. A computer program product comprising instructions operable to cause a processor to carry out a method according to any one of claims 1 to 27.
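Claims 20 to 25 recite concrete threshold tests on the bio-signals: tracking a derivative, comparing gradient and amplitude against predetermined thresholds, checking inter-channel correlation, and comparing summed power or power ratios between detector groups. A minimal sketch of how such tests might look in code follows; the function names, the choice of NumPy, the mean-square power estimate, and every threshold value are assumptions of this illustration, not part of the specification.

```python
import numpy as np

# Illustrative sketch only: names and thresholds are hypothetical.

def detect_event_by_threshold(channel, grad_thresh, amp_thresh):
    """Claims 20-21 style check: track the derivative of a bio-signal and
    flag samples where both gradient and amplitude exceed predetermined
    thresholds."""
    gradient = np.gradient(channel)
    return (np.abs(gradient) > grad_thresh) & (np.abs(channel) > amp_thresh)

def channels_correlated(left, right, corr_thresh):
    """Claim 22 style check: correlation between the bio-signals of a
    detector pair against a predetermined correlation threshold."""
    return np.corrcoef(left, right)[0, 1] > corr_thresh

def classify_by_power(group_a, group_b, sum_thresh, ratio_thresh):
    """Claims 24-25 style checks: the summed power of one detector group
    (first movement type) and the power ratio between two groups
    (second movement type)."""
    power_a = sum(np.mean(ch ** 2) for ch in group_a)  # mean-square power
    power_b = sum(np.mean(ch ** 2) for ch in group_b)
    first_type = power_a > sum_thresh
    second_type = power_b > 0 and (power_a / power_b) > ratio_thresh
    return first_type, second_type
```

In practice the per-channel checks and the group-power checks would be combined, as claim 19 suggests, by comparing each detector pair's output against a stored signature for the facial muscle movement being detected.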
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP06774959A EP1934677A4 (en) | 2005-09-12 | 2006-09-12 | Method and system for detecting and classifying facial muscle movements |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/225,598 US20070060830A1 (en) | 2005-09-12 | 2005-09-12 | Method and system for detecting and classifying facial muscle movements |
US11/225,598 | 2005-09-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007030868A1 true WO2007030868A1 (en) | 2007-03-22 |
Family
ID=37856224
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/AU2006/001331 WO2007030868A1 (en) | 2005-09-12 | 2006-09-12 | Method and system for detecting and classifying facial muscle movements |
Country Status (5)
Country | Link |
---|---|
US (2) | US20070060830A1 (en) |
EP (1) | EP1934677A4 (en) |
CN (1) | CN101310242A (en) |
TW (1) | TW200729014A (en) |
WO (1) | WO2007030868A1 (en) |
Families Citing this family (91)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060257834A1 (en) * | 2005-05-10 | 2006-11-16 | Lee Linda M | Quantitative EEG as an identifier of learning modality |
CA2606870C (en) * | 2005-05-16 | 2017-06-27 | Cerebral Diagnostics Canada Incorporated | Near-real time three-dimensional localization, display, recording, and analysis of electrical activity in the cerebral cortex |
WO2007030275A2 (en) * | 2005-09-02 | 2007-03-15 | Emsense Corporation | A device and method for sensing electrical activity in tissue |
US20070060830A1 (en) * | 2005-09-12 | 2007-03-15 | Le Tan Thi T | Method and system for detecting and classifying facial muscle movements |
US8230457B2 (en) | 2007-03-07 | 2012-07-24 | The Nielsen Company (Us), Llc. | Method and system for using coherence of biological responses as a measure of performance of a media |
US9215996B2 (en) * | 2007-03-02 | 2015-12-22 | The Nielsen Company (Us), Llc | Apparatus and method for objectively determining human response to media |
US20090070798A1 (en) * | 2007-03-02 | 2009-03-12 | Lee Hans C | System and Method for Detecting Viewer Attention to Media Delivery Devices |
US20090253996A1 (en) * | 2007-03-02 | 2009-10-08 | Lee Michael J | Integrated Sensor Headset |
US20080218472A1 (en) * | 2007-03-05 | 2008-09-11 | Emotiv Systems Pty., Ltd. | Interface to convert mental states and facial expressions to application input |
US8473044B2 (en) * | 2007-03-07 | 2013-06-25 | The Nielsen Company (Us), Llc | Method and system for measuring and ranking a positive or negative response to audiovisual or interactive media, products or activities using physiological signals |
US20080221969A1 (en) * | 2007-03-07 | 2008-09-11 | Emsense Corporation | Method And System For Measuring And Ranking A "Thought" Response To Audiovisual Or Interactive Media, Products Or Activities Using Physiological Signals |
US8764652B2 (en) * | 2007-03-08 | 2014-07-01 | The Nielsen Company (Us), Llc | Method and system for measuring and ranking an “engagement” response to audiovisual or interactive media, products, or activities using physiological signals |
US8782681B2 (en) * | 2007-03-08 | 2014-07-15 | The Nielsen Company (Us), Llc | Method and system for rating media and events in media based on physiological data |
WO2008121651A1 (en) | 2007-03-29 | 2008-10-09 | Neurofocus, Inc. | Analysis of marketing and entertainment effectiveness |
US9886981B2 (en) | 2007-05-01 | 2018-02-06 | The Nielsen Company (Us), Llc | Neuro-feedback based stimulus compression device |
US8392253B2 (en) | 2007-05-16 | 2013-03-05 | The Nielsen Company (Us), Llc | Neuro-physiology and neuro-behavioral based stimulus targeting system |
US8533042B2 (en) | 2007-07-30 | 2013-09-10 | The Nielsen Company (Us), Llc | Neuro-response stimulus and stimulus attribute resonance estimator |
US8386313B2 (en) | 2007-08-28 | 2013-02-26 | The Nielsen Company (Us), Llc | Stimulus placement system using subject neuro-response measurements |
US8392255B2 (en) | 2007-08-29 | 2013-03-05 | The Nielsen Company (Us), Llc | Content based selection and meta tagging of advertisement breaks |
US8376952B2 (en) * | 2007-09-07 | 2013-02-19 | The Nielsen Company (Us), Llc. | Method and apparatus for sensing blood oxygen |
US20090083129A1 (en) | 2007-09-20 | 2009-03-26 | Neurofocus, Inc. | Personalized content delivery using neuro-response priming data |
US8327395B2 (en) * | 2007-10-02 | 2012-12-04 | The Nielsen Company (Us), Llc | System providing actionable insights based on physiological responses from viewers of media |
CN101917898A (en) | 2007-10-31 | 2010-12-15 | 埃姆申塞公司 | System and method for distributed collection and centralized processing of physiological responses from spectators |
WO2009073634A1 (en) * | 2007-11-30 | 2009-06-11 | Emsense Corporation | Correlating media instance information with physiological responses from participating subjects |
US8347326B2 (en) | 2007-12-18 | 2013-01-01 | The Nielsen Company (US) | Identifying key media events and modeling causal relationships between key events and reported feelings |
US8326408B2 (en) | 2008-06-18 | 2012-12-04 | Green George H | Method and apparatus of neurological feedback systems to control physical objects for therapeutic and other reasons |
WO2009154480A1 (en) | 2008-06-20 | 2009-12-23 | Business Intelligence Solutions Safe B.V. | A method of graphically representing a tree structure |
US20100016753A1 (en) * | 2008-07-18 | 2010-01-21 | Firlik Katrina S | Systems and Methods for Portable Neurofeedback |
US10192389B2 (en) | 2008-09-01 | 2019-01-29 | New Bis Safe Luxco S.À.R.L. | Methods, apparatus and systems for determining an adjustment value of a gaming device |
CN102099770B (en) * | 2009-01-19 | 2014-01-08 | 松下电器产业株式会社 | Activation device, method, and computer program for brain wave interface system |
US20100250325A1 (en) | 2009-03-24 | 2010-09-30 | Neurofocus, Inc. | Neurological profiles for market matching and stimulus presentation |
WO2011007569A1 (en) * | 2009-07-15 | 2011-01-20 | 国立大学法人筑波大学 | Classification estimating system and classification estimating program |
US10987015B2 (en) | 2009-08-24 | 2021-04-27 | Nielsen Consumer Llc | Dry electrodes for electroencephalography |
US20110106750A1 (en) | 2009-10-29 | 2011-05-05 | Neurofocus, Inc. | Generating ratings predictions using neuro-response data |
US9560984B2 (en) | 2009-10-29 | 2017-02-07 | The Nielsen Company (Us), Llc | Analysis of controlled and automatic attention for introduction of stimulus material |
US8364255B2 (en) * | 2010-03-10 | 2013-01-29 | Brainscope Company, Inc. | Method and device for removing EEG artifacts |
US8684742B2 (en) | 2010-04-19 | 2014-04-01 | Innerscope Research, Inc. | Short imagery task (SIT) research method |
US8655428B2 (en) | 2010-05-12 | 2014-02-18 | The Nielsen Company (Us), Llc | Neuro-response data synchronization |
US11318949B2 (en) * | 2010-06-07 | 2022-05-03 | Affectiva, Inc. | In-vehicle drowsiness analysis using blink rate |
US10482333B1 (en) * | 2017-01-04 | 2019-11-19 | Affectiva, Inc. | Mental state analysis using blink rate within vehicles |
WO2012012755A2 (en) * | 2010-07-22 | 2012-01-26 | Washington University | Correlating frequency signatures to cognitive processes |
WO2012049362A1 (en) * | 2010-10-13 | 2012-04-19 | Aalto University Foundation | A projection method and system for removing muscle artifacts from signals based on their frequency bands and topographies |
KR101208719B1 (en) | 2011-01-07 | 2012-12-06 | 동명대학교산학협력단 | System for processing biological signal and portable instrumnet for processing biological signal |
WO2012116232A1 (en) * | 2011-02-23 | 2012-08-30 | University Of Utah Research Foundation | Systems and methods for decoding neural signals |
WO2012125596A2 (en) | 2011-03-12 | 2012-09-20 | Parshionikar Uday | Multipurpose controller for electronic devices, facial expressions management and drowsiness detection |
US9830507B2 (en) | 2011-03-28 | 2017-11-28 | Nokia Technologies Oy | Method and apparatus for detecting facial changes |
CN102525453B (en) * | 2012-02-15 | 2014-03-19 | 南京伟思医疗科技有限责任公司 | Electroencephalogram detection device and method |
US9569986B2 (en) | 2012-02-27 | 2017-02-14 | The Nielsen Company (Us), Llc | System and method for gathering and analyzing biometric user feedback for use in social media and advertising applications |
US9451303B2 (en) | 2012-02-27 | 2016-09-20 | The Nielsen Company (Us), Llc | Method and system for gathering and computing an audience's neurologically-based reactions in a distributed framework involving remote storage and computing |
US9292858B2 (en) | 2012-02-27 | 2016-03-22 | The Nielsen Company (Us), Llc | Data collection system for aggregating biologically based measures in asynchronous geographically distributed public environments |
US20160232811A9 (en) * | 2012-06-14 | 2016-08-11 | Robert A. Connor | Eyewear System for Monitoring and Modifying Nutritional Intake |
US20140350353A1 (en) * | 2013-05-27 | 2014-11-27 | Robert A. Connor | Wearable Imaging Device for Monitoring Food Consumption using Gesture Recognition |
US10130277B2 (en) * | 2014-01-28 | 2018-11-20 | Medibotics Llc | Willpower glasses (TM)—a wearable food consumption monitor |
GB201211703D0 (en) | 2012-07-02 | 2012-08-15 | Charles Nduka Plastic Surgery Ltd | Biofeedback system |
US8989835B2 (en) | 2012-08-17 | 2015-03-24 | The Nielsen Company (Us), Llc | Systems and methods to gather and analyze electroencephalographic data |
US9754336B2 (en) | 2013-01-18 | 2017-09-05 | The Medical Innovators Collaborative | Gesture-based communication systems and methods for communicating with healthcare personnel |
WO2014152630A1 (en) * | 2013-03-14 | 2014-09-25 | Julian Michael Urbach | Eye piece for augmented and virtual reality |
US9773332B2 (en) | 2013-03-14 | 2017-09-26 | Otoy, Inc. | Visual cortex thought detector interface |
US9320450B2 (en) | 2013-03-14 | 2016-04-26 | The Nielsen Company (Us), Llc | Methods and apparatus to gather and analyze electroencephalographic data |
US9235968B2 (en) | 2013-03-14 | 2016-01-12 | Otoy, Inc. | Tactile elements for a wearable eye piece |
US9141851B2 (en) * | 2013-06-28 | 2015-09-22 | Qualcomm Incorporated | Deformable expression detector |
US9622702B2 (en) | 2014-04-03 | 2017-04-18 | The Nielsen Company (Us), Llc | Methods and apparatus to gather and analyze electroencephalographic data |
US10121063B2 (en) * | 2015-01-12 | 2018-11-06 | BMT Business Meets Technology Holding AG | Wink gesture based control system |
EP3270766B1 (en) * | 2015-03-18 | 2023-11-15 | T&W Engineering A/S | Eeg monitor |
CN106137207A (en) * | 2015-04-03 | 2016-11-23 | 北京智谷睿拓技术服务有限公司 | Method and apparatus for determining feeding action information |
US9936250B2 (en) | 2015-05-19 | 2018-04-03 | The Nielsen Company (Us), Llc | Methods and apparatus to adjust content presented to an individual |
CN106681484B (en) * | 2015-11-06 | 2019-06-25 | 北京师范大学 | In conjunction with the image object segmenting system of eye-tracking |
CN105662336B (en) * | 2015-12-23 | 2019-03-19 | 黑龙江科技大学 | Signal denoising processing method and device |
US10515474B2 (en) | 2017-01-19 | 2019-12-24 | Mindmaze Holding Sa | System, method and apparatus for detecting facial expression in a virtual reality system |
EP3571627A2 (en) * | 2017-01-19 | 2019-11-27 | Mindmaze Holding S.A. | Systems, methods, apparatuses and devices for detecting facial expression and for tracking movement and location including for at least one of a virtual and augmented reality system |
US10943100B2 (en) | 2017-01-19 | 2021-03-09 | Mindmaze Holding Sa | Systems, methods, devices and apparatuses for detecting facial expression |
GB2561537B (en) | 2017-02-27 | 2022-10-12 | Emteq Ltd | Optical expression detection |
EP3672478A4 (en) | 2017-08-23 | 2021-05-19 | Neurable Inc. | Brain-computer interface with high-speed eye tracking features |
WO2019060298A1 (en) | 2017-09-19 | 2019-03-28 | Neuroenhancement Lab, LLC | Method and apparatus for neuroenhancement |
EP3697297A4 (en) * | 2017-10-19 | 2020-12-16 | Facebook Technologies, Inc. | Systems and methods for identifying biological structures associated with neuromuscular source signals |
US20200329991A1 (en) * | 2017-10-20 | 2020-10-22 | Panasonic Corporation | Electroencephalogram measurement system, electroencephalogram measurement method, program, and non-transitory storage medium |
US11717686B2 (en) | 2017-12-04 | 2023-08-08 | Neuroenhancement Lab, LLC | Method and apparatus for neuroenhancement to facilitate learning and performance |
EP3731749A4 (en) | 2017-12-31 | 2022-07-27 | Neuroenhancement Lab, LLC | System and method for neuroenhancement to enhance emotional response |
US11328533B1 (en) | 2018-01-09 | 2022-05-10 | Mindmaze Holding Sa | System, method and apparatus for detecting facial expression for motion capture |
US11907423B2 (en) | 2019-11-25 | 2024-02-20 | Meta Platforms Technologies, Llc | Systems and methods for contextualized interactions with an environment |
US11364361B2 (en) | 2018-04-20 | 2022-06-21 | Neuroenhancement Lab, LLC | System and method for inducing sleep by transplanting mental states |
CA3112564A1 (en) | 2018-09-14 | 2020-03-19 | Neuroenhancement Lab, LLC | System and method of improving sleep |
US10664050B2 (en) | 2018-09-21 | 2020-05-26 | Neurable Inc. | Human-computer interface using high-speed and accurate tracking of user interactions |
CN113423341A (en) | 2018-11-27 | 2021-09-21 | 脸谱科技有限责任公司 | Method and apparatus for automatic calibration of wearable electrode sensor system |
US11395615B2 (en) * | 2019-04-17 | 2022-07-26 | Bose Corporation | Fatigue and drowsiness detection |
US11786694B2 (en) | 2019-05-24 | 2023-10-17 | NeuroLight, Inc. | Device, method, and app for facilitating sleep |
CN110321807A (en) * | 2019-06-13 | 2019-10-11 | 南京行者易智能交通科技有限公司 | Yawning behavior recognition method and device based on a convolutional neural network with multilayer feature fusion |
CN110739042A (en) * | 2019-10-29 | 2020-01-31 | 浙江迈联医疗科技有限公司 | Limb movement rehabilitation method and device based on brain-computer interface, storage medium and equipment |
CN111956217B (en) * | 2020-07-15 | 2022-06-24 | 山东师范大学 | Blink artifact identification method and system for real-time electroencephalogram signals |
CN113855019B (en) * | 2021-08-25 | 2023-12-29 | 杭州回车电子科技有限公司 | Expression recognition method and device based on EOG (electrooculography), EMG (electromyography) and piezoelectric signals |
WO2024042530A1 (en) * | 2022-08-24 | 2024-02-29 | X-Trodes Ltd | Method and system for electrophysiological determination of a behavioral activity |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1997033515A1 (en) * | 1996-03-15 | 1997-09-18 | Kenneth Michael Zawilinski | Emotional response analyzer system with multimedia display |
WO2001007128A1 (en) * | 1999-07-24 | 2001-02-01 | Korea Research Institute Of Jungshin Science | Game device using brain waves and gaming method therefor |
US6422999B1 (en) * | 1999-05-13 | 2002-07-23 | Daniel A. Hill | Method of measuring consumer reaction |
US20030032890A1 (en) * | 2001-07-12 | 2003-02-13 | Hazlett Richard L. | Continuous emotional response analysis with facial EMG |
WO2004037086A1 (en) * | 2002-10-23 | 2004-05-06 | Daimlerchrysler Ag | Method for optimising and recording product attractiveness or product acceptance by observing cerebral activity |
Family Cites Families (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5195531A (en) * | 1991-03-01 | 1993-03-23 | Bennett Henry L | Anesthesia adequacy monitor and method |
US5724987A (en) * | 1991-09-26 | 1998-03-10 | Sam Technology, Inc. | Neurocognitive adaptive computer-aided training method and system |
US6349231B1 (en) * | 1994-01-12 | 2002-02-19 | Brain Functions Laboratory, Inc. | Method and apparatus for will determination and bio-signal control |
JP3310498B2 (en) * | 1994-09-02 | 2002-08-05 | 独立行政法人産業技術総合研究所 | Biological information analyzer and biological information analysis method |
US6233472B1 (en) * | 1995-06-06 | 2001-05-15 | Patient Comfort, L.L.C. | Electrode assembly and method for signaling a monitor |
US6001065A (en) * | 1995-08-02 | 1999-12-14 | Ibva Technologies, Inc. | Method and apparatus for measuring and analyzing physiological signals for active or passive control of physical and virtual spaces and the contents therein |
US5740812A (en) * | 1996-01-25 | 1998-04-21 | Mindwaves, Ltd. | Apparatus for and method of providing brainwave biofeedback |
US6292688B1 (en) * | 1996-02-28 | 2001-09-18 | Advanced Neurotechnologies, Inc. | Method and apparatus for analyzing neurological response to emotion-inducing stimuli |
US5899867A (en) * | 1996-10-11 | 1999-05-04 | Collura; Thomas F. | System for self-administration of electroencephalographic (EEG) neurofeedback training |
US6121953A (en) * | 1997-02-06 | 2000-09-19 | Modern Cartoons, Ltd. | Virtual reality system for sensing facial movements |
US6097981A (en) * | 1997-04-30 | 2000-08-01 | Unique Logic And Technology, Inc. | Electroencephalograph based biofeedback system and method |
KR100281650B1 (en) * | 1997-11-13 | 2001-02-15 | 정선종 | EEG analysis method for discrimination of positive / negative emotional state |
US6609017B1 (en) * | 1998-08-07 | 2003-08-19 | California Institute Of Technology | Processed neural signals and methods for generating and using them |
US6594632B1 (en) * | 1998-11-02 | 2003-07-15 | Ncr Corporation | Methods and apparatus for hands-free operation of a voice recognition system |
EP1139240A3 (en) * | 2000-03-28 | 2003-11-05 | Kenji Mimura | Design method and design evaluation method, and equipment thereof |
LU90582B1 (en) * | 2000-05-16 | 2001-11-19 | Europ Economic Community | System for identifying brain activity |
WO2002039371A2 (en) * | 2000-11-03 | 2002-05-16 | Koninklijke Philips Electronics N.V. | Estimation of facial expression intensity using a bidirectional star topology hidden markov model |
US7027621B1 (en) * | 2001-03-15 | 2006-04-11 | Mikos, Ltd. | Method and apparatus for operator condition monitoring and assessment |
EP1401330A4 (en) * | 2001-06-07 | 2005-04-06 | Lawrence Farwell | Method and apparatus for brain fingerprinting, measurement, assessment and analysis of brain function |
DE60216411T2 (en) * | 2001-08-23 | 2007-10-04 | Sony Corp. | ROBOT DEVICE, FACE DETECTION METHOD AND FACIAL DETECTION DEVICE |
DE10149049A1 (en) * | 2001-10-05 | 2003-04-17 | Neuroxx Gmbh | Method for creating and modifying virtual biological representation of computer application user, requires forming virtual biological representation of user in computer application |
JP3813552B2 (en) * | 2002-07-22 | 2006-08-23 | 横浜ゴム株式会社 | Work stress determination device, work stress determination program, and work stress determination method |
GB2395780A (en) * | 2002-11-29 | 2004-06-02 | Sony Uk Ltd | Face detection |
US7546158B2 (en) * | 2003-06-05 | 2009-06-09 | The Regents Of The University Of California | Communication methods based on brain computer interfaces |
US7388971B2 (en) * | 2003-10-23 | 2008-06-17 | Northrop Grumman Corporation | Robust and low cost optical system for sensing stress, emotion and deception in human subjects |
US7120486B2 (en) * | 2003-12-12 | 2006-10-10 | Washington University | Brain computer interface |
JP2006006355A (en) * | 2004-06-22 | 2006-01-12 | Sony Corp | Processor for biological information and video and sound reproducing device |
US9820658B2 (en) * | 2006-06-30 | 2017-11-21 | Bao Q. Tran | Systems and methods for providing interoperability among healthcare devices |
US20070060830A1 (en) * | 2005-09-12 | 2007-03-15 | Le Tan Thi T | Method and system for detecting and classifying facial muscle movements |
2005
- 2005-09-12 US US11/225,598 patent/US20070060830A1/en not_active Abandoned
2006
- 2006-09-12 EP EP06774959A patent/EP1934677A4/en not_active Withdrawn
- 2006-09-12 CN CNA2006800415840A patent/CN101310242A/en active Pending
- 2006-09-12 TW TW095133790A patent/TW200729014A/en unknown
- 2006-09-12 WO PCT/AU2006/001331 patent/WO2007030868A1/en active Application Filing
- 2006-09-12 US US11/531,117 patent/US20070179396A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See also references of EP1934677A4 * |
Also Published As
Publication number | Publication date |
---|---|
US20070060830A1 (en) | 2007-03-15 |
US20070179396A1 (en) | 2007-08-02 |
EP1934677A1 (en) | 2008-06-25 |
TW200729014A (en) | 2007-08-01 |
EP1934677A4 (en) | 2009-12-09 |
CN101310242A (en) | 2008-11-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20070179396A1 (en) | Method and System for Detecting and Classifying Facial Muscle Movements | |
Butkevičiūtė et al. | Removal of movement artefact for mobile EEG analysis in sports exercises | |
Parra et al. | Response error correction-a demonstration of improved human-machine performance using real-time EEG monitoring | |
Wang et al. | Common spatial pattern method for channel selelction in motor imagery based brain-computer interface | |
US9211078B2 (en) | Process and device for brain computer interface | |
Krusienski et al. | An evaluation of autoregressive spectral estimation model order for brain-computer interface applications | |
US20070066914A1 (en) | Method and System for Detecting and Classifying Mental States | |
US20120226185A1 (en) | Readiness potential-based brain-computer interface device and method | |
JPS63226340A (en) | Method and apparatus for displaying timewise relation between position and internal area of brain nerve activity | |
KR20190030612A (en) | System for providing subject-independent brain-computer interface and method thereof | |
CN110135285B (en) | Electroencephalogram resting state identity authentication method and device using single-lead equipment | |
Mucarquer et al. | Improving EEG muscle artifact removal with an EMG array | |
US20190034797A1 (en) | Data generation apparatus, biological data measurement system, classifier generation apparatus, data generation method, classifier generation method, and recording medium | |
Fickling et al. | Good data? The EEG quality index for automated assessment of signal quality | |
Sarin et al. | Automated ocular artifacts identification and removal from EEG data using hybrid machine learning methods | |
Islam et al. | Probability mapping based artifact detection and wavelet denoising based artifact removal from scalp EEG for BCI applications | |
Dzitac et al. | Identification of ERD using fuzzy inference systems for brain-computer interface | |
Cososchi et al. | EEG features extraction for motor imagery | |
Bin et al. | A study of informative EEG channel and brain region for typing activity | |
Dhiman et al. | Artifact removal from eeg recordings–an overview | |
Yi et al. | Evaluation of mental workload associated with time pressure in rapid serial visual presentation tasks | |
Paulraj et al. | Fractal feature based detection of muscular and ocular artifacts in EEG signals | |
KR20140009635A (en) | Extracting erd/ers feature using pso | |
Kanoga et al. | Semi-simulation experiments for quantifying the performance of SSVEP-based BCI after reducing artifacts from trapezius muscles | |
Yong et al. | Automatic artefact detection in a self-paced brain-computer interface system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | WWE | Wipo information: entry into national phase | Ref document number: 200680041584.0; Country of ref document: CN |
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | WWE | Wipo information: entry into national phase | Ref document number: 2006774959; Country of ref document: EP |
 | WWP | Wipo information: published in national office | Ref document number: 2006774959; Country of ref document: EP |