Biomedical Signal Processing and Control 56 (2020) 101687
A hybrid BCI-controlled smart home system combining SSVEP and EMG for individuals with paralysis
Xiaoke Chai a, Zhimin Zhang a, Kai Guan a, Yangting Lu a, Guitong Liu a, Tengyu Zhang a,b, Haijun Niu a,∗
a Beijing Advanced Innovation Center for Biomedical Engineering, School of Biological Science and Medical Engineering, Beihang University, Beijing 100191, China
b National Research Center for Rehabilitation Technical Aids, Beijing 100176, China
Article info
Article history: Received 10 October 2018; Received in revised form 13 September 2019; Accepted 5 October 2019
Keywords: hybrid brain–computer interface (hBCI); electromyography (EMG); steady-state visual evoked potential (SSVEP); paralysis; smart home system
Abstract
In this study, electromyogram (EMG) signals associated with occlusal movement were integrated with steady-state visual evoked potentials (SSVEPs) to develop a hybrid brain–computer interface (hBCI)-based smart home control system for individuals with paralysis. The SSVEP paradigm was used to develop a system containing one main interface and five sub-interfaces corresponding to several devices during the working state, and one interface for the idle state. Participants controlled the devices by gazing at certain stimuli, which flickered at different frequencies for each function. Canonical correlation analysis (CCA) of four-channel EEG signals was used to recognize SSVEP features as the intended selection. Several particular occlusal EMG patterns from a single temporalis muscle channel were used to confirm the selected function, return from the sub-interface to the main interface, and switch the system on/off, respectively. Five healthy participants and five individuals with paralysis completed the system control experiment. The average target selection accuracy reached 97.5% and 83.6% in healthy participants and patients, while the confirmation accuracy in each group reached 97.6% and 96.9%, respectively. When SSVEPs were combined with EMG signals from occlusal movement to confirm the target selection, the actual control accuracy was maximized to 100%, and the information transfer rate (ITR) reached 45 bit/min among patients. Operation of the hBCI-based smart home control system did not cause higher mental or physical workload in patients compared to healthy participants. Our findings indicate that combining SSVEP and EMG signals effectively enhances the safety and interactivity of hBCI-based smart home systems. © 2019 Elsevier Ltd. All rights reserved.
1. Introduction
Brain–computer interfaces (BCIs) allow direct communication between the human brain and external devices, providing an alternative to the normal neuromuscular pathway [1]. Recently, rapid developments in cognitive neuroscience, computer science, and rehabilitation engineering have increased interest in BCI research [2]. Neural engineering studies have indicated that BCI technology may be useful in the rehabilitation of severe paralysis following amputation, stroke, or spinal cord injury (SCI) by restoring communication between the patient's brain and the daily living environment [3].
BCIs can be implemented using various modalities, such as magnetoencephalography (MEG), functional magnetic resonance imaging (fMRI), near-infrared spectroscopy (NIRS), and electroencephalography (EEG) [4–6]. In particular, BCIs based on non-implantable EEG have been widely utilized due to their high safety, low cost, and ease of acquisition [7]. EEG-based BCIs can rely on slow cortical potentials, motor imagery (MI), the P300 component, or the steady-state visual evoked potential (SSVEP) [8–12]. Among these, SSVEP-based BCIs have received increasing attention, since the SSVEP is a natural brain response evoked by continuous visual stimulation, can be used without training, and is associated with a relatively higher information transfer rate (ITR) than other modalities [13]. The paradigms of SSVEP-based BCIs can be modified in various ways, with each stimulus displayed on the screen corresponding to a certain EEG feature [14]. Rebsamen et al. [15] demonstrated that SSVEP-based BCI systems can be used to control a wheelchair, while Shyu et al. [16]
reported that such systems can be used to control a hospital bed. In order to develop BCI-based smart home systems, the safety and accuracy of current systems must be improved. Most BCIs based on a single EEG paradigm are capable of controlling only a single device, whereas a BCI system that can communicate with multiple devices is desirable. In addition, single-mode BCIs are usually synchronous and entail constant stimulation, causing user fatigue and reducing the accuracy of control [17]. Moreover, most previous studies have included only healthy participants [18]. However, patients with severe paralysis may also exhibit cognitive impairment and abnormal spontaneous EEG activity, and current systems that involve complex paradigms requiring long-term training are not applicable to this population. Therefore, further research is required to develop a simple, home-based BCI system with a user-friendly interface and minimal training.
Given these issues, researchers have proposed hybrid BCI (hBCI) systems that combine two or more EEG paradigms to improve system performance. Pfurtscheller et al. [19] introduced an event-related desynchronization (ERD) potential into an SSVEP-BCI to improve the accuracy of the system. Allison et al. [20] combined mu rhythms and SSVEP to achieve asynchronous control, while Li et al. [21] combined SSVEP and the P300 potential to reduce the false positive rate of an environmental control system. Bi et al. [22] proposed a two-dimensional cursor control system that uses SSVEP and P300 signals to control the direction and speed of the cursor, respectively. Other forms of hBCI have incorporated various physiological signals from the electrocardiogram (ECG), electrooculogram (EOG), or electromyogram (EMG) [23,24]. For example, Kenji et al. [25] combined EMG signals and SSVEP to achieve control of an exoskeleton rehabilitation training robot, while Lin et al. utilized forearm EMG features during fist closure to enhance localization of the SSVEP stimulus in a BCI spelling system [26]. However, as patients with severe paralysis may only be able to utilize their facial muscles for control of BCI systems, systems based on occlusal movements may be more appropriate in this patient population. Chang et al. [27] developed an hBCI system that could be controlled based on masseter EMG signals. Li et al. [28] further demonstrated that different occlusal movements could be used to move and stop an hBCI-based wheelchair. Although Muhammad et al. [29] designed an hBCI system that relies on alpha waves, SSVEP, EOG signals, and EMG signals associated with occlusal movement, the system was difficult to use and exhibited severe delay. Thus, further work is required to develop safe hBCI systems that can be employed by individuals with paralysis with minimal workload.
In the present study, EMG signals associated with occlusal movement (jaw clenching) were integrated into an SSVEP-BCI to develop an hBCI-based smart home control system for individuals with severe paralysis. This hBCI-based system was designed using simple visual interfaces. Four-channel EEG signals were used to recognize the intended command, while single-channel EMG signals were used to confirm the target, return to the main menu, and switch the system on/off, effectively enhancing the interactivity and safety of the system. To validate the reliability of the system and the performance of the proposed functions, we tested the system in five healthy participants and five individuals with paralysis.
2. Methods
2.1. Smart home system
The hBCI system consists of two parts: the EEG/EMG fusion module and the smart home control module. As shown in Fig. 1, the
fusion module includes a visual stimulation interface, as well as signal acquisition/processing, feature classification, command coding, and feedback systems. The stimulus interface is presented on a laptop LCD monitor and was programmed using the Psychtoolbox in MATLAB (MathWorks, Inc.) [30]. All calculations are performed on the working computer, while the EEG/EMG input and the resultant output are transmitted via the Transmission Control Protocol/Internet Protocol (TCP/IP). Fig. 2 shows the smart home environment, in which the control module was compatible with various devices, enabling wireless communication with beds, wheelchairs, TVs, telephones, curtains, and lamps via ZigBee networking. Continuous voltage signals, switch signals, and digital signals were coded and sent by the control module to control the different devices and functions; the control codes were derived from the fusion decision based on the EEG and EMG recognition results. The system was implemented with each device as follows. For the nursing bed, continuous voltage signals were used to control the relay, with each output corresponding to a certain height of movement or rotation angle, enabling movement of the full bed or a portion of the bed. For the wheelchair, switch signals were used to change the position of the wheelchair, while continuous voltage signals were used to achieve continuous moving and turning; an infrared module was adopted to enable avoidance of obstacles during wheelchair use, at which point a "protection mode" is activated. Digital signals from the control module were used to control the television, enabling the user to move the mouse in four directions, adjust the volume, return to the homepage, and switch the power on/off. Encoding of digital signals also enabled users to operate a simple telephone designed for older adults: the system enabled users to dial three emergency numbers, switch to hands-free mode, and end the call. Continuous voltage signals were used to control a scroll-type curtain, the movement of which could be stopped using switch signals. Switch signals were also used to control the power to two lights: a dome light and a desk lamp.
2.2. Functional interface
As shown in Fig. 3, seven graphical user interfaces (GUIs) were designed for the hBCI system. The system interface consists of three levels: the first level contains an idle interface, the second level contains the main interface, and the third level consists of five sub-interfaces, one for each device (wheelchair, hospital bed, television, telephone, curtain/lights). Each interface contains a certain number of stimulus targets, and the function of each stimulus is represented by a graphical indicator. The arrangement of the stimulation frequencies is shown in Fig. 3; the numbers in each of the interfaces correspond to the flickering frequency (Hz) of each target. After switching to the working state, users can select the appropriate device and function by gazing at the relevant stimulus. Once the device has been selected, the system presents the corresponding GUI. In each interface, the stimuli flicker at certain frequencies. When the command has been identified, the stimulus is displayed in green, and the user is presented with a prompt to confirm execution of the selected function. The graphical icons used in each interface are shown in Fig. 3. In the Wheelchair interface, the icons correspond to standing, lying, and sitting positions, as well as forward/backward and left/right movement.
In the Nursing bed interface, the icons correspond to raising/lowering the bed, turning left/right, and moving the back or legs up/down. In the Television interface, the icons correspond to up/down/left/right movement of the selection box, turning the volume up/down, returning to the homepage, and switching the power on/off. In the Telephone interface, the icons correspond to dialing three emergency numbers, hands-free mode, and ending the
Fig. 1. System Modules.
Fig. 2. Smart Home Environment.
call. In the Light/Curtain interface, the icons correspond to on/off mode for the dome light and desk lamp, and to moving the curtain up/down.
2.3. Signal processing
2.3.1. SSVEP detection
Firstly, the baseline of the collected EEG was removed by subtracting the average value, and a 3–40 Hz band-pass filter was used to eliminate low- and high-frequency noise. Then, canonical correlation analysis (CCA) was used to identify SSVEP
frequencies. CCA is a non-parametric multivariate method that reveals the underlying correlation between two sets of multidimensional variables [9]; it finds a pair of linear transforms for the two sets such that the transformed sets have maximum correlation. Since different interfaces were associated with different numbers of stimulation targets, different stimuli were presented at specific frequencies in certain positions on the screen. For example, in interfaces with eight targets, stimuli were presented at frequencies (fn) of 8, 9, 10, 11, 12, 13, 14, and 15 Hz, respectively. For CCA, EEG signals from four occipital region channels (PO4, PO3, O1, O2) were used as input signals, and the reference signals (Yfn) were composed of
Fig. 3. Functional Icons in User Interfaces. The numbers correspond to the flickering frequency (Hz) of each target.
sinusoid and co-sinusoid pairs at the frequency of the stimulus and its second harmonic, as illustrated in Eq. (1), where $f_s$ is the sampling rate, $n$ is the index of the target, and $h$ is the harmonic number, $h \in [1, 2]$:

$$Y_{f_n} = \begin{bmatrix} \sin(2\pi f_n t) \\ \cos(2\pi f_n t) \\ \vdots \\ \sin(2\pi h f_n t) \\ \cos(2\pi h f_n t) \end{bmatrix}, \quad t = \frac{1}{f_s}, \frac{2}{f_s}, \ldots \tag{1}$$
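To make the detection pipeline concrete, the following is a minimal Python sketch of CCA-based SSVEP identification as described above (baseline removal, 3–40 Hz band-pass filtering, and correlation of the four-channel EEG window with the sine/cosine references of Eq. (1)). It is an illustrative reconstruction rather than the authors' implementation; the sampling rate, window length, and frequency set are taken from this paper, while the filter order and library choices (SciPy, scikit-learn) are assumptions.

```python
# Minimal sketch of CCA-based SSVEP identification (not the authors' code).
# Assumes 4-channel occipital EEG sampled at 1000 Hz and a 3-s analysis window;
# the filter order and library choices are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.cross_decomposition import CCA

FS = 1000                                    # sampling rate (Hz)
FREQS = [8, 9, 10, 11, 12, 13, 14, 15]       # flicker frequencies of an 8-target interface (Hz)

def preprocess(eeg):
    """Remove the mean of each channel and band-pass filter the EEG between 3 and 40 Hz."""
    eeg = eeg - eeg.mean(axis=0, keepdims=True)
    b, a = butter(4, [3 / (FS / 2), 40 / (FS / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=0)

def reference_signals(f, n_samples, harmonics=(1, 2)):
    """Sine/cosine reference set Y_f at the stimulus frequency and its second harmonic (Eq. 1)."""
    t = np.arange(1, n_samples + 1) / FS
    refs = []
    for h in harmonics:
        refs.append(np.sin(2 * np.pi * h * f * t))
        refs.append(np.cos(2 * np.pi * h * f * t))
    return np.column_stack(refs)

def classify_ssvep(eeg):
    """Return the flicker frequency whose references correlate best with the EEG window.

    `eeg` has shape (n_samples, 4), one column per occipital channel (PO3, PO4, O1, O2).
    """
    x = preprocess(eeg)
    scores = []
    for f in FREQS:
        y = reference_signals(f, x.shape[0])
        cca = CCA(n_components=1)
        xc, yc = cca.fit_transform(x, y)
        scores.append(np.corrcoef(xc[:, 0], yc[:, 0])[0, 1])
    return FREQS[int(np.argmax(scores))]
```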
2.3.2. EMG detection
We detected patterns of EMG activity from the temporalis muscle channel. EMG data were classified using a threshold algorithm based on the integral EMG (iEMG) and the duration of the contraction. First, a 20–150 Hz band-pass filter was applied to the EMG data. Then the envelope of the EMG data was extracted by calculating the iEMG, as illustrated in Eq. (2), where Envelope(n) refers to the iEMG at sample point n, EMG(i) refers to the value of the EMG data at sample point i, and the window size is W = 20:

$$\mathrm{Envelope}(n) = \frac{1}{W}\sum_{i=n}^{n+W-1}\left|\mathrm{EMG}(i)\right| \tag{2}$$

The duration for EMG detection is one second, so each 1000-point EMG segment was split into 50 segments. The iEMG was obtained for the 50 segments, and the resulting envelope of the EMG data was used to detect EMG patterns. A mean filter was then used to smooth the envelope curve, after which the duration of the clench was calculated based on the number of points exceeding the designated threshold. If the duration of the clench fell within a specified range, one of three corresponding EMG patterns was detected. EMG Pattern 1 was defined as a single clench with a duration of 0.2–1.0 s. EMG Pattern 2 was defined as a two-time clench, i.e., two clenches of 0.2–1.0 s each occurring within one second. EMG Pattern 3 was defined as an extended clench lasting longer than 2 s.
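As a companion sketch, the envelope and pattern logic of Section 2.3.2 could be implemented roughly as follows. Only the 20–150 Hz band, the W = 20 window of Eq. (2), and the three duration-based patterns come from the text; the amplitude threshold, filter order, and the exact segmentation of the analysis buffer are assumptions, since the paper does not report them.

```python
# Minimal sketch of the iEMG-envelope / duration-threshold detector of Section 2.3.2
# (not the authors' code). The amplitude threshold and filter order are placeholders.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000                      # sampling rate (Hz)
W = 20                         # iEMG window size from Eq. (2)
THRESHOLD = 0.05               # placeholder threshold (device/participant dependent)

def iemg_envelope(emg):
    """Band-pass filter (20-150 Hz) the raw EMG and compute a moving average of |EMG| (Eq. 2)."""
    b, a = butter(4, [20 / (FS / 2), 150 / (FS / 2)], btype="band")
    emg = filtfilt(b, a, emg)
    return np.convolve(np.abs(emg), np.ones(W) / W, mode="same")

def clench_durations(envelope):
    """Durations (s) of contiguous supra-threshold segments in the analysis buffer."""
    durations, run = [], 0
    for above in envelope > THRESHOLD:
        if above:
            run += 1
        elif run:
            durations.append(run / FS)
            run = 0
    if run:
        durations.append(run / FS)
    return durations

def classify_pattern(durations):
    """Map clench durations to the three EMG patterns described above (None if no match)."""
    if any(d > 2.0 for d in durations):
        return 3                            # Pattern 3: extended clench (> 2 s)
    short = [d for d in durations if 0.2 <= d <= 1.0]
    if len(short) == 2:
        return 2                            # Pattern 2: two short clenches within the buffer
    if len(short) == 1:
        return 1                            # Pattern 1: single clench of 0.2-1.0 s
    return None
```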
Fig. 4. Control Procedure.
2.3.3. Fusion protocols
The control mode was developed by fusing the three EMG patterns with SSVEPs. The control code is determined by combining the recognition result, the system implementation stage, and the current interface. The system control commands can be divided into three classes: the Switch command, which is used to convert between the idle and working states; the Return command, which is used to return from the device interface to the main interface; and the Selection & Confirmation command, which allows users to select and verify the target they wish to control. The Switch and Return commands are determined based on recognition of EMG Patterns 3 and 2, respectively. The Selection & Confirmation command is a compound command that is co-determined based on the recognition of SSVEPs and EMG Pattern 1. The control procedure is illustrated in Fig. 4. During the idle state, EMG Pattern 3 is used to turn on the system. Two steps are required to switch from the main interface to the sub-interfaces: a device-selection stage lasting 3 s and a device-confirmation stage lasting 1 s. In the device-selection stage, devices on the main interface are chosen by gazing at the stimuli. In the device-confirmation stage, the flicker whose SSVEP features were recognized is labeled in green, following which EMG Pattern 1 is used to reach the sub-interfaces. Device functions on each sub-interface are also selected based on SSVEP recognition and confirmed based on EMG pattern recognition. EMG Pattern 2 was used to return to the main interface from the sub-interfaces. In the sub-interfaces, the key command
"stop" for the wheelchair and curtains was also discriminated based on EMG Pattern 1.
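The fusion logic above can be summarized as a small state machine. The sketch below only illustrates the protocol described in this section and is not the authors' code; the class and command names are hypothetical, and `classify_ssvep`/`classify_pattern` stand for detectors such as those sketched in Sections 2.3.1 and 2.3.2.

```python
# Illustrative state machine for the fusion protocol of Section 2.3.3 (not the authors' code).
# A recognized SSVEP selection is held as "pending" until an EMG pattern confirms or discards it.

IDLE, MAIN, SUB = "idle", "main", "sub"          # interface levels

class FusionController:
    def __init__(self):
        self.state = IDLE
        self.pending = None                      # SSVEP selection awaiting EMG confirmation

    def on_ssvep(self, target):
        """A recognized SSVEP target is highlighted in green and held until confirmation."""
        if self.state != IDLE:
            self.pending = target

    def on_emg(self, pattern):
        """Translate a detected EMG pattern into a system command."""
        if pattern == 3:                         # extended clench: toggle idle/working state
            self.state = MAIN if self.state == IDLE else IDLE
            self.pending = None
            return "SWITCH"
        if self.state == IDLE:
            return None                          # ignore other patterns while idle
        if pattern == 2 and self.state == SUB:   # two-time clench: back to the main interface
            self.state = MAIN
            self.pending = None
            return "RETURN"
        if pattern == 1 and self.pending is not None:
            target, self.pending = self.pending, None
            if self.state == MAIN:               # confirmed device selection opens its sub-interface
                self.state = SUB
                return ("OPEN", target)
            return ("EXECUTE", target)           # confirmed function command is sent to the device
        return None
```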
3. Experiment
3.1. Participants
Five healthy volunteers (three men and two women, mean age: 23.8 ± 1.1 years) and five individuals with paralysis (three men and two women, mean age: 48.8 ± 7.9 years) participated in the control experiment. No participants had prior experience using any BCI-based system. All had normal or corrected-to-normal vision, as well as the ability to control their eye gaze. All participants provided informed written consent before taking part in the experiment, which was approved by the Ethics Committee of the Rehabilitation Hospital of the National Research Center for Rehabilitation Technical Aids. Individuals with psychiatric disorders or cognitive impairment (such as tristimania, epilepsy, or Alzheimer disease) were excluded from the study. The clinical information of the five individuals with paralysis is shown in Table 1, including age, sex, duration (days since injury), lesion location, and affected motor function. The level and severity of SCI were assessed based on the criteria outlined by the American Spinal Injury Association (ASIA) [31].
Table 1 Clinical information of patients.

ID | Age | Sex | Diagnosis | Duration (days) | Lesion Location | Affected Function
P1 | 61 | M | Hemorrhagic Stroke | 61 | Right Pontine | Left lower limbs
P2 | 47 | M | Hemorrhagic Stroke | 32 | Left Pontine, bilateral centrum ovale | Right lower limbs
P3 | 51 | M | Spinal Cord Injury | 4748 | 4th Thoracic vertebra | ASIA Grade A
P4 | 65 | F | Spinal Cord Injury | 180 | 11th Thoracic vertebra | ASIA Grade B
P5 | 20 | F | Encephala paralysis | 4382 | Cerebral | Bilateral lower limb
3.2. Online verification
EEG and EMG signals were simultaneously collected at a sampling rate of 1,000 Hz using the SynAmps2 system (Neuroscan, Inc.). EEG electrodes were placed in accordance with the international 10–20 system, and the reference electrode was located at the vertex. The EEG channels comprised the four occipital region channels (PO4, PO3, O1, O2). The EMG electrode was located on the lower edge of the temporalis muscle, on the side of the head in front of the upper part of the ear. In all cases, EEG electrode impedance was lower than 10 kΩ, while EMG electrode impedance was lower than 20 kΩ; the electrode impedances were confirmed before the experiment. During the experiment, participants were seated in a comfortable chair at a distance of approximately 70 cm from the monitor (at eye level). The visual stimuli were presented on an LCD monitor (14-inch; 1366 × 768 pixels; 60 Hz refresh rate), and the flickers were realized using the sampled sinusoidal stimulation method [32]. The dynamic range of the stimulus luminance is from 0 to 1, where 0 represents black and 1 represents white. With a screen refresh rate of F, the stimulus luminance of a target with flash frequency f in the i-th frame is given by Eq. (3):

$$\mathrm{stim}(i, f) = \frac{1}{2}\left[1 + \sin\left(2\pi f \frac{i}{F}\right)\right] \tag{3}$$
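For illustration, the luminance sequence of Eq. (3) can be generated as follows. The authors implemented the stimuli with Psychtoolbox in MATLAB; this short Python version is only a sketch of the sampled sinusoidal method, with the 60 Hz refresh rate taken from the monitor described above.

```python
# Illustrative generation of the sampled sinusoidal stimulus luminance of Eq. (3)
# (the authors used Psychtoolbox/MATLAB; this Python version is only a sketch).
import numpy as np

def stimulus_luminance(freq, n_frames, refresh_rate=60):
    """Per-frame luminance (0 = black, 1 = white) of a target flickering at `freq` Hz."""
    frames = np.arange(n_frames)
    return 0.5 * (1.0 + np.sin(2 * np.pi * freq * frames / refresh_rate))

# One second of a 10 Hz flicker on the 60 Hz monitor used in this study:
lum = stimulus_luminance(10, 60)
```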
Each participant was instructed to complete the control task by following the indicated icons in accordance with the prescribed procedures. The experiment included 37 target Selection & Confirmation commands, five Return commands, and two Switch commands. Participants were first instructed to open the main interface via an extended clench (EMG Pattern 3), following which they were required to select each sub-interface by gazing at the visual stimuli and to confirm the selected command via a one-time clench (EMG Pattern 1). After finishing the commands in each sub-interface, they were instructed to return to the main interface via a two-time clench (EMG Pattern 2). Lastly, they were required to close the system using an extended clench. Following the task, each participant completed a subjective evaluation of brain workload (NASA Task Load Index [NASA-TLX]) [33].
3.3. Evaluation
The online test system recorded several important evaluation indices, such as the number of executions required to successfully complete each command (Selection, Confirmation, Switch, and Return). If the recognized and target commands were not consistent, a selection error was considered to have occurred, and participants were instructed to repeat the command. Correct control output was defined only when both target selection and confirmation were correct. The following indices were defined to evaluate system performance:
1) Selection Accuracy: the ratio of the number of correct Selections to the total number of target selections.
2) Confirmation Accuracy: the ratio of the number of Confirmations to the total number of correct selections.
3) Control Accuracy: the ratio of the number of correct Control outputs to the total number of target commands.
4) Return Accuracy: the ratio of the number of correct Returns to the total number of Return commands.
5) Switch Accuracy: the ratio of the number of correct Switches to the total number of Switch commands.
Moreover, the information transfer rate (ITR) [34] was calculated to evaluate the BCI system; a higher ITR means that the system can transfer more information per unit of time. The ITR is given by Eq. (4), where N is the number of targets (N = 4, 5, 7, or 8 for the different interfaces in this study), P is the mean control accuracy, and T is the trial time, which comprised the 3-s stimulation and the 1-s choice confirmation:

$$\mathrm{ITR} = \frac{60}{T}\left[\log_2 N + P\log_2 P + (1 - P)\log_2\frac{1 - P}{N - 1}\right] \tag{4}$$
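As a worked example of Eq. (4), the short sketch below computes the ITR; it is illustrative only. With N = 8 targets, a control accuracy of P = 1 (as achieved with confirmation), and T = 4 s (3-s selection plus 1-s confirmation), the formula gives 45 bit/min, matching the value reported for patients.

```python
# Worked example of the ITR formula in Eq. (4) (illustrative sketch, not the authors' code).
import math

def itr_bits_per_min(n_targets, accuracy, trial_time_s):
    """ITR for N targets, mean control accuracy P, and trial duration T (seconds)."""
    n, p = n_targets, accuracy
    if p >= 1.0:
        bits = math.log2(n)                 # the P*log2(P) and (1-P) terms vanish at P = 1
    else:
        bits = math.log2(n) + p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return 60.0 / trial_time_s * bits

# 8 targets, 100% control accuracy, 4-s trial (3-s selection + 1-s confirmation):
print(itr_bits_per_min(8, 1.0, 4.0))        # 45.0 bit/min
```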
In addition, the NASA-TLX score has been used as an indicator of BCI system performance [21,35]. The scale includes six domains corresponding to mental demand, physical demand, temporal demand, effort, performance, and frustration levels. The subjective workload for each factor is rated using a 20-step bipolar scale, with scores ranging from 0 to 100. For the mental demand, physical demand, temporal demand, effort, and frustration domains, low scores represent low workload [33]. Mann-Whitney U tests were used to test the differences in command accuracy and workload scores between the two groups. Statistical significance was defined as p < 0.05. Statistical analyses were conducted using SPSS software (IBM SPSS Statistics, IBM Corporation).
4. Results
4.1. Control accuracy
The average selection, confirmation, return, and switch accuracies for healthy participants are shown in Table 2. The average selection accuracy was 97.5%; although three participants selected incorrect targets, they did not confirm these selections. Thus, the control accuracy of the system was 100%. The average return and switch accuracies were 87.6% and 80.0%, respectively. The average selection, confirmation, return, and switch accuracies for individuals with paralysis are shown in Table 3. Mann-Whitney U tests revealed no significant differences in confirmation, return, or switch accuracy (based on EMG patterns) between the two groups. However, there was a significant difference (P = 0.008; * indicates P < 0.05 in the table) in selection accuracy (based on SSVEP recognition) between individuals with paralysis and healthy participants.
4.2. Information transfer rate (ITR)
Without the EMG-based target confirmation procedure, incorrect outputs would have caused the control accuracies of the healthy participants and individuals with paralysis to drop to their respective selection accuracies of 97.9% and 86.3%.
Table 2 Accuracy of Healthy Participants.

ID | Selection Accuracy | Confirmation Accuracy | Control Accuracy | Return Accuracy | Switch Accuracy
N1 | 37/37 | 37/37 | 37/37 | 5/5 | 2/3
N2 | 37/38 | 37/37 | 37/37 | 5/5 | 2/3
N3 | 37/40 | 37/37 | 37/37 | 5/6 | 2/2
N4 | 38/39 | 36/38 | 37/37 | 5/7 | 2/3
N5 | 38/38 | 36/38 | 37/37 | 5/6 | 2/2
Average | 97.5 ± 1.4%* | 97.9 ± 1.3% | 100 ± 0% | 87.6 ± 5.5% | 80.0 ± 8.2%
Table 3 Accuracy of Individuals with Paralysis.

ID | Selection Accuracy | Confirmation Accuracy | Control Accuracy | Return Accuracy | Switch Accuracy
P1 | 37/45 | 37/37 | 37/37 | 5/5 | 2/3
P2 | 38/51 | 37/38 | 37/37 | 5/6 | 2/2
P4 | 37/43 | 36/37 | 37/37 | 5/5 | 2/5
P5 | 38/43 | 37/38 | 37/37 | 5/5 | 2/2
P6 | 40/46 | 37/40 | 37/37 | 5/6 | 2/2
Average | 83.6 ± 2.5%* | 96.9 ± 1.2% | 100 ± 0% | 93.3 ± 4.1% | 81.3 ± 12.2%
However, the confirmation procedure enabled participants to cancel incorrect selections, resulting in a control accuracy of 100%, although confirmation increased the trial time from 3 s to 4 s. In Fig. 5, the blue line corresponds to the ITR of healthy participants for interfaces with different numbers of targets, without the confirmation procedure. The red line corresponds to the ITR of individuals with paralysis without the confirmation procedure. The black line shows that, for individuals with paralysis using the EMG-based confirmation procedure, the ITR was much higher, reaching 45 bit/min when N = 8. The difference between the red and blue lines indicates that the ITR of individuals with paralysis without confirmation was lower than that of healthy participants. The black line, however, indicates that confirmation can improve the ITR, further suggesting that confirmation is useful and necessary for individuals with paralysis.
4.3. Brain workload
As shown in Fig. 6, the mental workload, physical workload, and frustration scores during operation of the hBCI system were
Fig. 5. ITR of the hBCI System.
Fig. 6. NASA-TLX Scores.
lower than 30 across all participants. The performance scores were higher than 50, and the effort required to accomplish this level of performance was scored lower than 40. This implies that individuals with paralysis were satisfied with the proposed system. However, mental workload, physical workload, and frustration scores were slightly higher among individuals with paralysis. Mann-Whitney U tests revealed no significant differences between the groups for any of the indices (P values are shown in Fig. 6); all scores were less than 40 except for performance.
5. Discussion and Conclusion
In the present study, we developed an hBCI-based smart home control system that integrates EMG signals associated with occlusal movement and SSVEPs. The simple graphical interface of the system and its visual feedback resulted in improved interactivity. Relative to other SSVEP-based home systems that use continuously flickering stimuli [36], our EMG-based control mode may relieve visual stress, and recognition accuracy was high among healthy controls who had no experience using a BCI system. Although selection accuracy was lower in individuals with paralysis, the EMG-based confirmation procedure enabled all participants to cancel incorrect commands, thus improving the safety and efficacy of the system. Relative to other systems in which SSVEPs were used for error correction [37], our EEG/EMG-based method provided significant advantages with regard to both accuracy and ITR. Moreover, the system achieved higher control accuracy and a lower rate of false positives than other BCI-based home control systems, such as the BCI-based environmental control system using P300 [21]. The present hBCI system incorporated one-channel EMG and four-channel EEG signals, enabling complex control of a home environment based on less input than previous systems [21,36]. No participant-specific parameters were required for the recognition of SSVEP and EMG patterns, and all users shared the same setup. EMG signals associated with contraction of the temporalis muscle achieved high confirmation, return, and switch accuracies. Relative to a previous SSVEP-BCI home system that utilized only one EMG pattern [38], our system offers richer functionality, enables users to switch among interfaces, and exhibited superior performance. Our system is also advantageous in that it is "plug and play", meaning that it can be shut down or initiated at will. In previous BCI home control systems, control intentions were detected based on EEG signals alone [21], which may result in inaccurate output. The use of simple clench-related signals for Switch commands effectively prevented incorrect operations during the idle state. The EEG/EMG fusion approach of the present study takes full advantage of the high-accuracy and multi-target selection capabilities of the SSVEP paradigm, as well as the fast and stable recognition of EMG patterns. Incorporating simple visual and clench-based commands not only enhances the safety of the system, but also renders the system feasible for use among individuals with paralysis. Compared to other BCI home systems based on P300 or MI paradigms that were used by individuals with SCI [21,39], our approach also allows individuals with stroke, whose spontaneous EEG may be abnormal, to operate devices. The occlusal motion mode is flexible to control, and EMG signals from the temporalis can be easily recorded simultaneously with the adjacent EEG channels. Furthermore, our results indicated that operation of the hBCI-based smart home control system did not cause high mental or physical workload, and that individuals with paralysis did not experience significantly greater workload than healthy participants.
In addition, the proposed hBCI system can be expanded to include other functions, as it utilizes a tree-like GUI that can be adapted to different environments by integrating more devices or adjusting the target functions. In the online experiments, there were several failures in occlusal pattern recognition that resulted in repeated Return and Switch commands; analysis of the EMG data showed that some clenches were too slight. The EEG/EMG fusion component can also be used independently of the control module, which may help users become familiar with the fusion control mode and clench patterns, thus improving the accuracy and safety of the system.
In this study, EMG signals of occlusal movement were integrated with SSVEPs to develop an hBCI-based smart home control system for individuals with paralysis. When SSVEPs were combined with EMG signals from simple occlusal movement to confirm the target action, the actual control accuracy and the ITR were maximized among individuals with paralysis. Three EMG patterns from occlusal movement were used in this study, but we believe that more clench-based EMG patterns can be used in the hBCI system. Further studies should improve the recognition accuracy of various clench patterns and explore which clench pattern is most suitable for a certain control command of a given device. In addition, the signal acquisition device in this work was a 64-channel EEG cap, which is not only expensive but also inconvenient to wear compared to the devices used in other studies [36]. Further research should develop wearable signal acquisition equipment that only employs specific channels to promote the practical application of smart home control systems for individuals with severe paralysis.
Acknowledgements
This work was supported by the National High Technology Research and Development Program of China (Grant No. 2015AA042304). The authors would also like to thank all of the participants who generously volunteered their time to participate in this study. The authors declare that they have no competing interests.
Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
References
[1] J.R. Wolpaw, N. Birbaumer, W.J. Heetderks, D.J. Mcfarland, Brain-computer interface technology: a review of the first international meeting, IEEE Trans. Neural Syst. Rehabil. Eng. 11 (2) (2003) 94. [2] J.J. Daly, J.E. Huggins, Brain-computer interface: current and emerging rehabilitation applications, Arch. Phys. Med. Rehabil. 96 (3) (2015) S1–S7. [3] T.M. Vaughan, Brain-computer interface adds a new dimension, Science 306 (2004) 1878–1879. [4] M.C. Schaeffer, E. Labyt, V. Rohu, N. Tarrin, I. Vergara, S. Cokgungor, Hand movement decoding from magnetoencephalographic signals for BCI applications, Neurophysiol. Clin. 46 (2) (2016) 104. [5] R. Sergio, B. Niels, S. Ranganatha, Abnormal neural connectivity in schizophrenia and fMRI-brain-computer interface as a potential therapeutic approach, Front. Psychiatry 4 (2013) 17. [6] M. Strait, C. Canning, M. Scheutz, Limitations of NIRS-based BCI for realistic applications in human-computer interaction, International Brain-Computer Interface Meeting (2013) 6–7. [7] J.R. Wolpaw, Brain-computer interfaces as new brain output pathways, J. Physiol. 579 (3) (2010) 613–619. [8] T. Hinterberger, S. Schmidt, N. Neumann, J. Mellinger, Brain-computer communication and slow cortical potentials, IEEE Trans.
Biomed. Eng. 51 (6) (2004) 1011–1018. [9] G. Bin, X. Gao, Z. Yan, B. Hong, S. Gao, An online multi-channel SSVEP-based brain-computer interface using a canonical correlation analysis method, J. Neural Eng. 6 (4) (2009), 046002.
[10] F. Aloise, et al., P300-based brain-computer interface for environmental control: an asynchronous approach, J. Neural Eng. 8 (2) (2011), 025025. [11] F. Aloise, F. Schettini, P. Aricò, F. Leotta, S. Salinari, D. Mattia, Neurofeedback-based motor imagery training for brain-computer interface (BCI), J. Neurosci Methods. 179 (1) (2009) 150–156. [12] M. Wang, I. Daly, B.Z. Allison, J. Jin, Y. Zhang, A. Chen, A new hybrid BCI paradigm based on P300 and SSVEP, J. Neurosci. Methods 244 (2015) 16–25. [13] F.B. Vialatte, M. Maurice, J. Dauwels, A. Cichocki, Steady-state visually evoked potentials: focus on essential paradigms and future perspectives, Prog. Neurobiol. 90 (4) (2010) 418–438. [14] X. Chen, Y. Wang, M. Nakanishi, X. Gao, T.P. Jung, S. Gao, High-speed spelling with a noninvasive brain-computer interface, Proc. Natl. Acad. Sci. U. S. A. 112 (44) (2015) E6058. [15] B. Rebsamen, C. Guan, H. Zhang, C. Wang, C. Teo, A.M. Jr, A brain controlled wheelchair to navigate in familiar environments, IEEE Trans. Neural Syst. Rehabil. Eng. 18 (6) (2010) 590–598. [16] K.K. Shyu, Y.J. Chiu, P.L. Lee, M.H. Lee, J.J. Sie, C.H. Wu, Total design of an FPGA-based brain–computer interface control hospital bed nursing system, IEEE Trans. Indus. Electros. 60 (7) (2013) 2731–2739. [17] S.M.T. Müller, T.F. Bastos, M.S. Filho, Proposal of a SSVEP-BCI to command a robotic wheelchair, J. Control Autom. Electron. Syst. 24 (1-2) (2013) 97–105. [18] D. Zhu, J. Bieger, M.G. Garcia, R.M. Aarts, A survey of stimulation methods used in SSVEP-based BCIs, Comput. Intell. Neurosci. 2010 (2010) 1. [19] G. Pfurtscheller, T. Solis-Escalante, R. Ortner, P. Linortner, G.R. Muller-Putz, Self-paced operation of an SSVEP-based orthosis with and without an imagery-based "brain switch:" a feasibility study towards a hybrid BCI, IEEE Trans. Neural Syst. Rehabil. Eng. 18 (4) (2010) 409–414. [20] B.Z. Allison, C. Brunner, C. Altstätter, I.C. Wagner, S. Grissmann, C. Neuper, A hybrid EDR/SSVEP BCI for continuous simultaneous two dimensional cursor control, J. Neurosci. Methods 209 (2) (2012) 299–307. [21] R. Zhang, Q. Wang, K. Li, S. He, S. Qin, Z. Feng, A BCI-based environmental control system for patients with severe spinal cord injuries, IEEE Trans. Bio. Eng. 99 (2017), 1-1. [22] L. Bi, J. Lian, K. Jie, R. Lai, Y. Liu, A speed and direction-based cursor control system with P300 and SSVEP, Biomed. Signal Proc. Control 14 (1) (2014) 126–133. [23] R. Scherer, G.R. Muller-Putz, G. Pfurtscheller, Self-initiation of EEG-based brain-computer communication using the heart rate response, J Neural Eng. 4 (4) (2007), L23-9. [24] N. Shinde, K. George, Brain-controlled driving aid for electric wheelchairs, IEEE, International Conference on Wearable and Implantable Body Sensor Networks (2016) 115–118.
[25] K. Kenji, A BMI-based robotic exoskeleton for neurorehabilitation and daily actions: a hybrid control method using EMG and SSVEP, Front. Human Neurosci. (2015) 9. [26] K. Lin, A. Cinetto, Y. Wang, X. Chen, S. Gao, X. Gao, An online hybrid BCI system based on SSVEP and EMG, J. Neural Eng. 13 (2) (2016), 026020. [27] B.C. Chang, B.H. Seo, Development of new brain computer interface based on EEG and EMG, 2008 IEEE International Conference on Robotics and Biomimetics (2009) 1665–1670. [28] Z. Li, S. Lei, C.Y. Su, G. Li, Hybrid brain/muscle-actuated control of an intelligent wheelchair, in: IEEE International Conference on Robotics and Biomimetics Shenzhen, China, December, 12–14, 2013. [29] M.A. Shah, A.A. Sheikh, A.M. Sajjad, M. Uppal, A hybrid training-less brain-machine interface using SSVEP and EMG signal, International Conference on Frontiers of Information Technology. IEEE Computer Society (2015) 93–97. [30] D.H. Brainard, The psychophysics toolbox, Spat. Vis. 10 (4) (1997) 433–436. [31] R.J. Marino, T. Barros, F. Bieringsorensen, et al., International standards for neurological classification of spinal cord injury, Chin. J. Rehabil. Theory Pract. 37 (2) (2014) 120–127. [32] X. Chen, Z. Chen, S. Gao, X. Gao, A high-ITR SSVEP-based BCI speller, Brain-Computer Interfaces 1 (3–4) (2014) 181–191. [33] S.G. Hart, L.E. Staveland, Development of NASA-TLX (Task Load Index): results of empirical and theoretical research, Adv. Psychol. 52 (6) (1988) 139–183. [34] F.B. Vialatte, M. Maurice, J. Dauwels, A. Cichocki, Steady-state visually evoked potentials: focus on essential paradigms and future perspectives, Prog. Neurobiol. 90 (4) (2010) 418–438. [35] A. Riccio, F. Leotta, L. Bianchi, F. Aloise, C. Zickler, E.J. Hoogerwerf, Workload measurement in a communication application operated through a P300-based brain–computer interface, J. Neural Eng. 8 (2) (2011), 025028. [36] J.S. Lin, C.H. Hsieh, A wireless BCI-controlled integration system in smart living space for patients, Wireless Pers. Commun. 88 (2) (2016) 395–412. [37] J. Pan, Y. Li, R. Zhang, Z. Gu, F. Li, Discrimination between control and idle states in asynchronous SSVEP-based brain switches: a pseudo-key-based approach, IEEE Trans. Neural Syst. Rehabil. Eng. 21 (3) (2013) 435–443. [38] N. Mora, I.D. Munari, P. Ciampolini, A multi-modal bci system for active and assisted living. 9677 (2016) 345–355. [39] N. Kosmyna, F. Tarpin-Bernard, N. Bonnefond, Feasibility of BCI control in a realistic smart home environment, Front. Human Neurosci. 10 (2016) 416.