Volume 9, Issue 5 (September & October 2018), BCN 2018, 9(5): 373-388


Sadeghi S, Maleki A. Recent Advances in Hybrid Brain-Computer Interface Systems: A Technological and Quantitative Review. BCN 2018; 9 (5) :373-388
URL: http://bcn.iums.ac.ir/article-1-960-en.html
1- Department of Biomedical Engineering, Faculty of New Sciences and Technologies, Semnan University, Semnan, Iran.
1. Introduction
A Brain-Computer Interface (BCI) system provides a non-muscular communication channel by creating a direct path between the brain and a computer, allowing people suffering from severe paralysis, muscular atrophy, amyotrophic lateral sclerosis, or brainstem stroke to communicate with their environment. A BCI consists of sensors and signal processing tools that directly convert brain activity into commands or messages (Müller-Putz et al., 2012; Guger, Allison, & Müller Putz, 2015). The block diagram of the BCI system is illustrated in Figure 1. Brain activity is measurable using several approaches such as electroencephalography, magnetoencephalography, functional Magnetic Resonance Imaging (fMRI), Electrocorticography (ECoG), and Near-Infrared Spectroscopy (NIRS) (Amiri, Rabbi, Azinfar, & Fazel-Rezai, 2013).
The Electroencephalogram (EEG) signal is used as the input in most BCI systems. EEG electrodes are placed on the scalp and special devices record the electric field of neural activity. Six brain rhythms can be distinguished in the EEG signal based on their frequency ranges: delta (1-4 Hz), theta (4-7 Hz), alpha (8-12 Hz), mu (8-13 Hz), beta (12-30 Hz), and gamma (25-100 Hz). The delta rhythm includes all activity below 3.5 Hz and is generated during deep sleep, in children, or in people with brain disorders.
The theta rhythm is mainly acquired from temporal and parietal areas and is visible in children and in some adults during stress, disappointment, and emotional distress. The alpha rhythm occurs in the awake, relaxed, eyes-closed condition. It is mostly recorded from the occipital lobes but can also be acquired from parietal and frontal regions. This rhythm disappears completely during sleep; when the awake subject becomes engaged in a particular mental activity, it is replaced by asynchronous waves of higher frequency and lower amplitude. The beta rhythm is mostly acquired from parietal and frontal areas.
During intense brain activity, it can reach frequencies as high as 50 Hz. It is divided into Beta I and Beta II. Beta I, whose frequency is about twice that of the alpha rhythm, is influenced by mental activities similar to those affecting the alpha rhythm, while Beta II appears in the central nervous system during intense activity and under stress. Alpha-range activity recorded from the sensorimotor areas is called the mu rhythm. The gamma rhythm, acquired from the somatosensory cortex, is involved in high-level cognitive functions and is important for learning, memory, and information processing (Amiri et al., 2013; Bharne & Kapgate, 2015).
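As an illustration of how these frequency bands are typically separated in practice, the following minimal Python sketch (not taken from any of the cited studies) band-pass filters a single-channel EEG trace into the rhythms listed above; the sampling rate and the input signal are assumptions.

import numpy as np
from scipy.signal import butter, filtfilt

BANDS = {  # Hz, as listed in the text above
    "delta": (1, 4), "theta": (4, 7), "alpha": (8, 12),
    "mu": (8, 13), "beta": (12, 30), "gamma": (25, 100),
}

def split_into_rhythms(eeg, fs):
    """Return one zero-phase band-pass-filtered copy of `eeg` per rhythm."""
    rhythms = {}
    for name, (lo, hi) in BANDS.items():
        hi = min(hi, 0.45 * fs)                      # keep the band below Nyquist
        b, a = butter(4, [lo, hi], btype="band", fs=fs)
        rhythms[name] = filtfilt(b, a, eeg)
    return rhythms

# Example: 10 s of synthetic data sampled at 256 Hz
rhythms = split_into_rhythms(np.random.randn(2560), fs=256.0)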
In general, BCI systems are categorized based on the brain activity patterns into four different types: the P300 component of the event-related potential, the Steady-State Visual Evoked Potential (SSVEP), the Slow Cortical Potential (SCP), and Event-Related Synchronization (ERS)/Event-Related Desynchronization (ERD).
P300 is an Event-Related Potential (ERP) component that appears approximately 300 ms after a visual, auditory, or tactile stimulus. Since P300-based BCI systems are vulnerable to noise, they require averaging the ERP responses to several stimuli, which reduces the speed and the Information Transfer Rate (ITR). On the other hand, these systems need little training and are widely accepted among users and patients (Fazel-Rezai & Ahmad, 2011; Fazel-Rezai et al., 2012; Wolpaw & Wolpaw, 2012).
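A minimal sketch of the averaging step mentioned above, assuming stimulus onsets (in samples) and a single-channel recording are already available; this is illustrative, not the cited systems' pipeline. Time-locked epochs are averaged so that the P300, which is phase-locked to the stimulus, survives while background EEG averages out; the signal-to-noise ratio grows roughly with the square root of the number of epochs, which is why fewer flashes mean lower accuracy but higher achievable speed.

import numpy as np

def average_erp(eeg, onsets, fs, tmin=-0.1, tmax=0.6):
    """Average stimulus-locked epochs of a 1-D EEG trace (onsets in samples)."""
    start, stop = int(tmin * fs), int(tmax * fs)
    epochs = [eeg[o + start:o + stop] for o in onsets
              if o + start >= 0 and o + stop <= len(eeg)]
    return np.mean(epochs, axis=0)  # non-time-locked activity is attenuated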
Visual Evoked Potentials (VEPs) are brain oscillations that occur after a visual stimulus. The Steady-State Visual Evoked Potential (SSVEP) is a kind of VEP that occurs in response to stimuli flickering at frequencies higher than 6 Hz. Although higher stimulation frequencies reduce fatigue and discomfort, recognizing the signal becomes more challenging. In general, SSVEP-based BCI systems have many advantages, such as better classification accuracy, higher ITR, and fewer required electrodes, compared with other methods such as P300. These systems need no training, or at most a very short training time. Although SSVEP-based systems are faster than P300-based systems, they have shortcomings such as unsuitability for patients with epilepsy, the need for precise control of the eye muscles, and the need for high-speed hardware (Guger et al., 2012; Guger et al., 2015).
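The following sketch illustrates one simple way an SSVEP target can be detected, by comparing spectral power at the candidate flicker frequencies and their second harmonics; it is an illustrative assumption, not the detector used in the cited studies, and the frequency list and sampling rate are placeholders.

import numpy as np

def detect_ssvep(eeg, fs, stim_freqs, bw=0.25):
    """Return the flicker frequency with the largest power (fundamental + 2nd harmonic)."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    power_near = lambda f: spectrum[(freqs > f - bw) & (freqs < f + bw)].sum()
    scores = [power_near(f) + power_near(2 * f) for f in stim_freqs]
    return stim_freqs[int(np.argmax(scores))]

# e.g. candidate flicker frequencies above 6 Hz, as mentioned above
# target = detect_ssvep(occipital_eeg, fs=256.0, stim_freqs=[8.0, 10.0, 12.0, 15.0])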
Slow Cortical Potentials (SCPs) are slow negative potential shifts in the EEG, acquired from the sensorimotor cortex during imagined or actual movement (Allison et al., 2012; Faller, 2012). These potentials belong to the part of the EEG with frequencies below 1 Hz and reflect cortical polarization. The use of these potentials is limited because of the long training time, high error risk, and poor dimensional control.
Mu and beta rhythms, both recorded from the sensorimotor cortex, are modulated by sensory stimulation or motor behavior. These rhythms show two types of amplitude fluctuation, named ERD and ERS. A voluntary movement causes a localized desynchronization in the lower mu and beta bands, called ERD, which begins about two seconds before movement onset. In fact, the decrease in neuronal synchronization reduces the power in specific frequency bands and thus the signal amplitude. After the voluntary movement, neuronal synchronization increases again and the band power rises, reaching its maximum about 600 ms after the movement; this is called ERS. Motor Imagery (MI) is another way to modulate ERD/ERS that is favored in BCI applications; however, it requires more training and may not be applicable to some subjects (Pfurtscheller & Da Silva, 1999).
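The relative power change described above is commonly quantified as ERD/ERS% = (A - R) / R x 100, where R is the band power in a reference (baseline) window and A is the band power in the analysis window (Pfurtscheller & Da Silva, 1999). The sketch below assumes a band-pass-filtered single trial and illustrative window boundaries; negative values indicate ERD, positive values ERS.

import numpy as np

def erd_ers_percent(band_eeg, fs, ref_window, test_window):
    """ERD/ERS% for a band-pass-filtered single-channel trial; windows in seconds."""
    def mean_power(w):
        i0, i1 = int(w[0] * fs), int(w[1] * fs)
        return np.mean(band_eeg[i0:i1] ** 2)
    r, a = mean_power(ref_window), mean_power(test_window)
    return (a - r) / r * 100.0

# e.g. mu-band trial: baseline 0-1 s, analysis window around movement onset
# change = erd_ers_percent(mu_trial, fs=256.0, ref_window=(0, 1), test_window=(2.5, 3.5))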
From another perspective, BCI systems are categorized into synchronous and asynchronous. In a synchronous system, the extraction and processing of signal features are prescheduled, based on a protocol that defines the start, end, and duration of each operation. In an asynchronous (self-paced) system, feature extraction and processing do not necessarily follow a fixed schedule.
2. Methods
2.1. Hybrid brain-computer interface systems

Since a BCI system based on a single method may not work for all subjects, the Hybrid BCI (HBCI) system has been introduced, and this area has attracted many researchers over time.
The current study aimed at reviewing and analyzing the current state-of-the-art HBCI studies. In this review, articles were sought from the Google Scholar database. Inclusion criteria were journal articles written in English from 2010 to December 2016. Other publication forms (e.g. books, proceedings papers, master’s and doctoral dissertations, unpublished working papers, newspapers, etc.) were not included. Keywords used in the search engine were “Hybrid” AND “Brain computer interface”, “Hybrid” AND “Brain machine interface”, “Brain computer interface” AND “Electroencephalography”, “Brain computer interface” AND “Electrooculography”, “Brain computer interface” AND “Electromyography”, “Brain computer interface” AND “Near-infrared spectroscopy”, “Brain computer interface” AND “evoked potential” OR “Brain computer interface” AND “Steady-state somatosensory evoked potential”.
After conducting the keyword search, some papers were retrieved more than once under different keywords; therefore, duplicates were excluded. Figure 2 shows the total number of articles published in different years based on the Google Scholar database. There were 13 articles in 2010, and this number rose significantly in the following years; 60 and 61 articles were published in this context in 2015 and 2016, respectively. The increasing number of articles published in the realm of HBCI indicates the high efficiency of these systems.
In an HBCI, a BCI control signal is combined with one or more other BCI control signals or with Human-Machine Interface (HMI) biosignals. HBCI systems are categorized according to the type of signals combined and the combination technique (simultaneous/sequential). In a simultaneous combination, the systems work concurrently, while in a sequential combination they operate in a time-shared manner: the first system selects the target among several options, and the second system processes that choice. A comprehensive block diagram of the different operation modes is presented in Figure 3, which describes the concept of system operation in both the simultaneous and the sequential mode and depicts the timing of stimulation in each.
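To make the two operation modes concrete, the sketch below shows the control flow only, with hypothetical classifier callables; it is not a description of any specific system in Table 1.

def sequential_hbci(first_classifier, second_classifier, data1, data2):
    """Time-shared operation: system 1 narrows the options, system 2 acts on that choice."""
    coarse_choice = first_classifier(data1)          # e.g. select a target group
    return second_classifier(data2, coarse_choice)   # e.g. refine or confirm the choice

def simultaneous_hbci(classifier_a, classifier_b, data_a, data_b, fuse):
    """Concurrent operation: both systems process the same trial and their outputs are fused."""
    return fuse(classifier_a(data_a), classifier_b(data_b))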
In general, the most important goals of combining signals in HBCI systems are to increase classification accuracy, enhance system speed, improve user satisfaction, and overcome the disadvantages of single BCI systems. In contrast, most of these hybrid systems come with greater complexity.
2.2. Types of HBCI Systems
To date, different combinations have been employed in HBCI systems. Figure 4 shows the number of articles published in different years based on the type of combination, obtained from the Google Scholar database. This Figure indicates that in the early years of HBCI systems, the combination of BCI control signals was used in various studies. Over time, the combination of BCI control signals with HMI biosignals was also considered. Figure 4 shows the gradual increase in the use of the Electromyogram (EMG), Electrooculogram (EOG), and Steady-State Somatosensory Evoked Potentials (SSSEP) in HBCI systems over time. However, the use of NIRS and eye trackers increased dramatically, especially in recent years. A summary of studies in the field of HBCI systems, with an emphasis on the specific characteristics of each study, is presented in Table 1. In the following, along with the introduction of the various signal combinations in HBCI systems, the methods and results of different studies are investigated.
2.2.1. The combination of P300 and SSVEP
Since both SSVEP and P300 are evoked by visual stimulation and neither of them requires training, the combination of these two signals is used in various applications such as target selection, movement control, and spellers. For controlling the direction and speed of movement, the simultaneous combination of SSVEP and P300 is associated with shortcomings such as low speed and the neglect of the resting state in synchronous systems (Bi, Lian, Jie, Lai, & Liu, 2014).
P300, associated with a high ITR, is considered the main mechanism of data transfer in many applications, including spellers. One solution to increase the ITR in P300-based spellers is to reduce the number of flashes. However, there is a trade-off between ITR and classification accuracy: reducing one increases the other. To increase the ITR, the simultaneous combination of P300 and SSVEP is used. To this end, all characters are divided into subareas by one of two techniques: Row/Column (RC) or Subarea/Location (SL).
All of the characters in each subarea flicker at the same frequency. At the same time, cues highlight the same location in each subarea in a pseudorandom sequence. Thus, only N1 flash codes for P300 and N2 frequencies for SSVEP are required to spell N1×N2 items. The RC mode is a better choice than the SL mode because of its higher average ITR, lower standard deviation of ITR, and faster speed (Yin et al., 2014). To increase classification accuracy, the simultaneous combination of SSVEP and P300 is used to reduce errors occurring in the rows or columns containing the target characters. Simultaneously with P300, several SSVEP frequencies are applied; hence, the characters in the same row or column may not have the same frequency (Yin et al., 2013). To increase the classification accuracy of detecting the control state (the period of time during which the subject intends to convey information), the sequential combination of P300 and SSVEP is used (Edlinger, Holzner, & Guger, 2011; Panicker, Puthusserypady, & Sun, 2011).
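A minimal sketch of the N1 × N2 coding idea described above: each item is addressed by one of N1 flash codes (detected via P300) and one of N2 flicker frequencies (detected via SSVEP). The 6 × 6 layout and the frequency values are illustrative assumptions, not the parameters of the cited spellers.

import string

def build_hybrid_layout(items, n_flash_codes, flicker_freqs):
    """Assign each speller item a (flash_code, flicker_frequency) pair."""
    assert len(items) <= n_flash_codes * len(flicker_freqs)
    layout = {}
    for i, item in enumerate(items):
        flash_code = i % n_flash_codes             # location highlighted for P300
        freq = flicker_freqs[i // n_flash_codes]   # subarea flicker frequency for SSVEP
        layout[item] = (flash_code, freq)
    return layout

# 36 items addressed with only 6 flash codes and 6 frequencies (6 x 6 = 36)
layout = build_hybrid_layout(string.ascii_uppercase + "0123456789",
                             n_flash_codes=6, flicker_freqs=[8, 9, 10, 11, 12, 13])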
In the simultaneous combination of P300 and SSVEP, to avoid any disruption of the flickering SSVEP frequency by the unstable frequency of the P300 stimulus, the P300 is evoked by deformation of the target rather than by flashing (Wang et al., 2015). In this approach, using the blocking (stop) feature of SSVEP coupled with the shape change for P300 can also be effective (Xu et al., 2013).
2.2.2. The combination of ERD and SSVEP
The combination of SSVEP and ERD is employed in various applications such as the control of wheelchairs, orthoses, and neural prostheses. When these signals are combined, the ability to use the preferred control signal at any moment increases the classification accuracy (Allison et al., 2010). This combination can also be effective in increasing the number of control commands. For example, even if four movement directions are classified using right- and left-wrist motor imagery together with the timing of the imagination, it is still not possible to control the cursor in several directions at the same moment (Bai, Lin, Huang, Fei, & Floeter, 2010).
The combination of SSVEP and ERD enables continuous movement in two dimensions simultaneously (Allison et al., 2012). Moreover, it allows simultaneous control of the direction and speed of a wheelchair, and the move/stop command can be issued within a short time (Li et al., 2013; Cao, Li, Ji, & Jiang, 2014; Li et al., 2014). To increase the ITR, combining these two signals is practical and better than switching from one modality to another, since this approach causes little fatigue (Brunner, Allison, Altstätter, & Neuper, 2011).
To increase classification accuracy, their combination is used to detect the resting state in various applications such as opening/closing an orthosis and controlling a neural prosthesis during grasping. Dividing the task into two steps and turning off the LED after completion of the first step reduce fatigue and the error rate by limiting the adverse impact of LED flashes on ERD detection (Pfurtscheller, Solis-Escalante, Ortner, Linortner, & Muller-Putz, 2010; Savić, Kisić, & Popović, 2011).
2.2.3. The combination of P300 and ERD
The most common practical applications of the P300 and ERD combination are wheelchair and robot control. In general, these devices are controlled in two ways. In the first, several targets are presented to the subject, who selects one of them; the subject is then moved automatically towards the target along a predetermined path and has no control over that path. In the second, the subject steers toward the target by issuing voluntary movement commands in different directions. For automatic wheelchair movement, the sequential combination of P300 and ERD is used.
In the first stage, the target is selected using P300 and the subject is moved toward it along a predetermined path. In the second stage, ERD is used for the stop command (Rebsamen et al., 2008; Rebsamen et al., 2010; Riechmann, Hachmeister, Ritter, & Finke, 2011). On the other hand, when voluntary control of the direction and speed of movement is considered, both simultaneous and sequential combinations are used.
Typically, in applications that employ the simultaneous combination of these two control signals, ERD is used to move in different directions and P300 is applied to control the speed or issue the stop command (Long et al., 2012a). In the sequential combination, ERD is used for navigation and P300 is applied to reach the desired object. Using ERD for navigation limits the number of commands, while P300 provides a control panel that offers the subject the possibility of further tasks (Finke, Knoblauch, Koesling, & Ritter, 2011; Su et al., 2011).
2.2.4. The combination of EEG and EOG
EEG-based systems are a valuable technology for restoring communication to patients with disability or paralysis who cannot move or speak. However, if a patient retains even a small ability to move the eyes, this ability can be used in conjunction with EEG signals in HBCI systems. Eye movement changes the orientation of the corneal-retinal potential, and electrodes placed around the eyes record this effect, known as the Electrooculogram (EOG). The combination of the EOG signal with other control signals is used in various applications such as the control of virtual keyboards, wheelchairs, mobile robots, etc.
Eye-movement input does not require much training and is very fast. The EOG signal has a relatively large amplitude; hence, it can easily be classified with high accuracy. This method is also economical since few electrodes are needed. As an example of this combination in robot control, turning right or left is achieved using only two EOG electrodes, while forward movement and a complete stop are issued by motor imagery and eye closing, respectively (Usakli, Gurkan, Aloise, Vecchiato, & Babiloni, 2009; Punsawad, Wongsawat, & Parnichkun, 2010).
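As a rough illustration of how a two-electrode EOG channel can yield left/right commands, the sketch below thresholds the horizontal EOG deflection; the threshold value and the sign convention are assumptions, not the settings of the cited systems.

def eog_direction(heog_uv, threshold_uv=100.0):
    """Map a horizontal EOG deflection (in microvolts) to a turn command."""
    if heog_uv > threshold_uv:
        return "right"
    if heog_uv < -threshold_uv:
        return "left"
    return "none"  # forward motion and full stop come from motor imagery and eye closing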

2.2.5. The combination of EEG and Eye-tracker
An eye-tracking system is a wearable human-computer interface that makes it possible to communicate through eye movements and blinking. The combination of this interface with EEG signals can be used in HBCI systems. The main use of this combination is cursor movement on the screen: the subject first guides the cursor to the target as quickly as possible and then selects it. Eye motion drives the cursor movement on the screen, and the target is selected by the EEG signal. Although the ITR of this combination is lower than that of a mouse, it is higher than that of a BCI alone (Kim, Kim, & Jo, 2015).
2.2.6. The combination of EEG and EMG
Some patients retain a small ability to move the muscles of certain body parts. In many applications, this residual motion is not by itself sufficient to control objects because of muscle weakness, exhaustion, or disruption of natural muscle tone. However, it can be effectively employed as a second signal in HBCI systems. For each patient, a suitable muscle is selected for electrode placement based on its ability to contract (Lalitharatne, Teramoto, Hayashi, & Kiguchi, 2013). The combination of EMG with motor imagery, P300, or SSVEP is employed in various applications. The SSVEP-based speller, despite its high ITR, high signal-to-noise ratio, and lack of need for training, only responds appropriately within a certain frequency range, which limits the number of target items.
To increase the ITR and the number of characters in spellers, the sequential combination of SSVEP and EMG is used: all characters are divided into several groups, and the characters in the same group flicker at different frequencies. The number of muscle activations determines the group number; after the desired group is determined, the target item is selected by SSVEP (Lin, Chen, Huang, Ding, & Gao, 2015). To increase classification accuracy, the simultaneous combination of motor imagery and EMG yields relatively better results than a BCI system alone (Leeb, Sagha, Chavarriaga, & del R Millán, 2011). In this regard, the P300 and EMG combination can be used to correct errors in spellers: contrary to BCI systems, which use a backspace to delete the wrong letter, the correction is realized with EMG in the hybrid mode (Riccio et al., 2015).
2.2.7. The combination of EEG and SSSEP
Many stroke patients whose muscles are damaged, as well as people who have lost the ability of eye gaze, may still be able to feel stimulation, which can be used in HBCI systems. For example, the combination of the steady-state somatosensory evoked potential obtained by selective sensation with motor imagery is used in cursor control. Motor imagery is the activation of efferent motor nerves, whereas selective sensation is the reception of afferent neural inputs related to the perception of stimulation. ERD and SSSEP are elicited by motor imagery and tactile stimulation, respectively. In the simultaneous combination of these signals, an increase in classification accuracy could not be achieved because of ERD degradation caused by the tactile signals (Ahn, Ahn, Cho, & Jun, 2013). In fact, ERD reduces the SSSEP amplitude, while selective sensation increases it (Yao, Meng, Zhang, Sheng, & Zhu, 2014a).
2.2.8. The combination of EEG and NIRS
NIRS is a non-invasive brain imaging technique that uses light in the 600-1000 nm wavelength range. It measures the hemodynamic response in terms of oxygenated hemoglobin, deoxygenated hemoglobin, water, etc. The EEG signal has good temporal resolution but poor spatial resolution, whereas NIRS has moderate temporal and spatial resolution and is also resistant to noise (Coyle, Ward, Markham, & McDarby, 2004; Herff et al., 2015). In HBCI systems, the combination of EEG and NIRS is used to increase the number of commands without reducing classification accuracy. For example, NIRS is used to measure brain activity caused by mental tasks (mental counting or performing subtraction) and the EEG signal is applied to detect movement (Khan, Hong, Naseer, Bhutta, & Yoon, 2014a).
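One common way to combine the two modalities, shown below as an illustrative assumption rather than the cited studies' exact pipeline, is feature-level fusion: EEG features (e.g. band power) and NIRS hemodynamic features extracted from the same trials are concatenated and fed to a single classifier.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def train_fused_classifier(eeg_features, nirs_features, labels):
    """Train an LDA on concatenated EEG + NIRS feature matrices (trials x features)."""
    joint = np.hstack([eeg_features, nirs_features])   # simple feature-level fusion
    return LinearDiscriminantAnalysis().fit(joint, labels)

# prediction for a new trial:
# clf = train_fused_classifier(eeg_train, nirs_train, y_train)
# clf.predict(np.hstack([eeg_trial, nirs_trial]).reshape(1, -1))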
3. Discussion
The current study assessed different HBCI systems, resulting from the combination of a BCI control signal with other BCI control signals or HMI biosignals, from the perspective of applications, capabilities, and limitations. HBCI systems are used in many applications such as object control, movement control, spellers, etc. Through the sequential combination of these systems, a complex task can be divided into several stages, with only one BCI system used at each stage. This combination method helps reduce errors by better distinguishing the rest state from the attention state. On the other hand, the main goal of the simultaneous combination of these systems is to increase the ITR. Generally, HBCI systems have higher ITR and greater classification accuracy than conventional BCI systems, but they are usually more complex. This complexity can affect the ease of use of the system and its acceptance by the user.
From this perspective, the design and implementation of these systems, including the number of channels, play an important role in system performance. Table 2 summarizes the results of different studies with an emphasis on the number of channels, ITR, and classification accuracy. The experimental conditions and signal recording considerations differ across these studies, and accuracy and ITR are sensitive to the experimental protocol, which makes it difficult to compare the results. Hence, to manage this issue, these measures are presented graphically in Figure 5, where the various HBCI combinations are compared on the two quantitative criteria of classification accuracy and ITR, and an ellipse is drawn for the range of ITR and classification accuracy of each method.
The center of each ellipse and its diameters are set based on the mean and standard deviation of the average values listed in Table 2. Classification accuracy and ITR are two important parameters for evaluating a BCI system. There is a trade-off between these two parameters: as one increases, the other decreases, and vice versa. For a particular application, an increase in accuracy may be the priority, or an increase in ITR may be more desirable. Therefore, considering the ultimate goal and depending on the application, the appropriate combination type should be determined. Knowing the location of the accuracy and ITR corresponding to each combination makes it easy to determine the optimal combination.
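The ITR values compared here are conventionally computed with the Wolpaw formula, ITR = [log2 N + P log2 P + (1 - P) log2((1 - P)/(N - 1))] x (60/T) bits/min, where N is the number of targets, P the classification accuracy, and T the time per selection in seconds. The function below implements this standard definition; it does not recompute the reviewed studies' figures.

import math

def itr_bits_per_min(n_targets, accuracy, seconds_per_selection):
    """Wolpaw information transfer rate in bits/min for an N-class selection task."""
    n, p = n_targets, accuracy
    if p <= 1.0 / n:
        return 0.0                              # at or below chance level
    bits = math.log2(n) + p * math.log2(p)
    if p < 1.0:
        bits += (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * 60.0 / seconds_per_selection

# e.g. a 36-item speller at 90% accuracy and 5 s per selection: about 50 bits/min
# itr_bits_per_min(36, 0.90, 5.0)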
The current study results showed that in most cases, the combination of a BCI control signal with an HMI biosignal had a relatively higher ITR than the combination of two BCI control signals. According to the Figure, the highest ITR was achieved by the combination of EEG and eye tracker, and the lowest by the combination of SSVEP and ERD. The combination of EEG and NIRS had the lowest classification accuracy, while the accuracy values of the other hybrid systems did not differ much from each other and were located within the range of 70% to 100%. Generally, in using HBCI systems, the combination technique can be determined based on the type of application, the main goal, and the capabilities of the patient.
Ethical Considerations
Compliance with ethical guidelines

There were no ethical considerations to be addressed in this research.
Funding
This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.
Authors' contributions
The authors' contributions are as follows: Conceptualization, methodology, resources, and supervision: Ali Maleki; Investigation and writing–original draft: Sahar Sadeghi; and Writing–review & editing: Both authors.
Conflict of interest
The authors declared no conflict of interest.


References
  1. Ahn, S., Ahn, M., Cho, H., & Jun, S. C. (2013). Hybrid brain-computer interface based on motor imagery and tactile selective attention. In N.L. Millán, R. Leeb, J.D.R. Millán, N. Guechoul, T. Carlson, A. Sobolewski, et al. (Eds.), Proceedings of TOBI Workshop IV (pp. 51-52). Switzerland: École polytechnique fédérale de Lausanne.
  2. Faller, J. (2012). BCIs that use steady-state visual evoked potentials or slow cortical potentials. In J. Wolpaw, & E. Winter Wolpaw (Eds.), Brain-Computer Interfaces, Principles and Practice. Oxford: Oxford University Press.
  3. Allison, B. Z., Brunner, C., Altstätter, C., Wagner, I. C., Grissmann, S., & Neuper, C. (2012). A hybrid ERD/SSVEP BCI for continuous simultaneous two dimensional cursor control. Journal of Neuroscience Methods, 209(2), 299-307. [DOI:10.1016/j.jneumeth.2012.06.022] [PMID]
  4. Allison, B. Z., Brunner, C., Kaiser, V., Müller-Putz, G. R., Neuper, C., & Pfurtscheller, G. (2010). Toward a hybrid brain-computer interface based on imagined movement and visual attention. Journal of Neural Engineering, 7(2), 026007. [DOI:10.1088/1741-2560/7/2/026007] [PMID]
  5. Al Shargie, F., Kiguchi, M., Badruddin, N., Dass, S. C., Hani, A. F. M., & Tang, T. B. (2016). Mental stress assessment using simultaneous measurement of EEG and fNIRS. Biomedical Optics Express, 7(10), 3882-98. [DOI:10.1364/BOE.7.003882] [PMID] [PMCID]
  6. Amiri, S., Rabbi, A., Azinfar, L., & Fazel-Rezai, R. (2013). A review of P300, SSVEP, and hybrid P300/SSVEP brain-computer interface systems. In R. Fazel-Rezai (Eds.), Brain-Computer Interface Systems-Recent Progress and Future Prospects. Intech Open.  [DOI:10.5772/56135]
  7. Bai, O., Lin, P., Huang, D., Fei, D. Y., & Floeter, M. K. (2010). Towards a user-friendly brain-computer interface: Initial tests in ALS and PLS patients. Clinical Neurophysiology, 121(8), 1293-1303. [DOI:10.1016/j.clinph.2010.02.157] [PMID] [PMCID]
  8. Bharne, P. P., & Kapgate, D. (2015). Hybrid visual BCI combining SSVEP and P300 with high ITR and accuracy. International Journal of Computer Science and Mobile Computing, 4(6), 1-5.
  9. Bhattacharyya, S., Konar, A., & Tibarewala, D. N. (2014). Motor imagery, P300 and error-related EEG-based robot arm movement control for rehabilitation purpose. Medical & Biological Engineering & Computing, 52(12), 1007-17. [DOI:10.1007/s11517-014-1204-4] [PMID]
  10. Bi, L., Lian, J., Jie, K., Lai, R., & Liu, Y. (2014). A speed and direction-based cursor control system with P300 and SSVEP. Biomedical Signal Processing and Control, 14, 126-33. [DOI:10.1016/j.bspc.2014.07.009]
  11. Breitwieser, C., Pokorny, C., & Gernot, R. M. (2016). A hybrid three-class brain-computer interface system utilizing SSSEPs and transient ERPs. Journal of Neural Engineering, 13(6), 066015. [DOI:10.1088/1741-2560/13/6/066015] [PMID]
  12. Brunner, C., Allison, B. Z., Altstätter, C., & Neuper, C. (2011). A comparison of three brain–computer interfaces based on event-related desynchronization, steady state visual evoked potentials, or a hybrid approach using both signals. Journal of Neural Engineering, 8(2), 025010. [DOI:10.1088/1741-2560/8/2/025010] [PMID]
  13. Buccino, A. P., Keles, H. O., & Omurtag, A. (2016). Hybrid EEG-fNIRS asynchronous brain-computer interface for multiple motor tasks. PloS One, 11(1), e0146610. [DOI:10.1371/journal.pone.0146610] [PMID] [PMCID]
  14. Cao, L., Li, J., Ji, H., & Jiang, C. (2014). A hybrid brain computer interface system based on the neurophysiological protocol and brain-actuated switch for wheelchair control. Journal of Neuroscience Methods, 229, 33-43. [DOI:10.1016/j.jneumeth.2014.03.011] [PMID]
  15. Capati, F. A., Bechelli, R. P., & Castro, M. C. F. (2016). Hybrid SSVEP/P300 BCI keyboard. In J. Gilbert, H. Azhari, H. Ali, C. Quintão, J. Sliwa, C. Ruiz, et al., (Eds.), Proceedings of the International Joint Conference on Biomedical Engineering Systems and Technologies (pp. 214-8). Setúbal: SciTePress-Science and Technology Publications. [PMID]
  16. Chang, M. H., Lee, J. S., Heo, J., & Park, K. S. (2016). Eliciting dual-frequency SSVEP using a hybrid SSVEP-P300 BCI. Journal of Neuroscience Methods, 258, 104-13. [DOI:10.1016/j.jneumeth.2015.11.001] [PMID]
  17. Coyle, S., Ward, T., Markham, C., & McDarby, G. (2004). On the Suitability of Near-Infrared (NIR) systems for next-generation brain-computer interfaces. Physiological Measurement, 25(4), 815. [DOI:10.1088/0967-3334/25/4/003] [PMID]
  18. Dong, X., Wang, H., Chen, Z., & Shi, B. E. (2015). Hybrid brain computer interface via Bayesian integration of EEG and eye gaze. Paper presented at the 17th International IEEE/EMBS Conference on Neural Engineering (NER), Montpellier, France, 2 July 2015. [DOI:10.1109/NER.2015.7146582]
  19. Edlinger, G., Holzner, C., & Guger, C. (2011). A hybrid brain-computer interface for smart home control. Paper presented at the International Conference on Human-Computer Interaction, Las Vegas, 15-20 July 2018. [DOI:10.1007/978-3-642-21605-3_46]
  20. Évain, A., Argelaguet, F., Casiez, G., Roussel, N., & Lécuyer, A. (2016). Design and evaluation of fusion approach for combining brain and gaze inputs for target selection. Frontiers in Neuroscience, 10, 454. [DOI:10.3389/fnins.2016.00454] [PMID] [PMCID]
  21. Fazel-Rezai, R., & Ahmad, W. (2011). P300-based brain-computer interface paradigm design. In U. Hoffmann, J.M. Vesin, T. Ebrahimi (Eds.), Recent Advances in Brain-Computer Interface Systems. IEEE Xplore Digital Library.
  22. Fazel Rezai, R., Allison, B. Z., Guger, C., Sellers, E. W., Kleih, S. C., & Kübler, A. (2012). P300 brain computer interface: Current challenges and emerging trends. Frontiers in Neuroengineering, 5, 14. [DOI:10.3389/fneng.2012.00014] [PMID] [PMCID]
  23. Fazli, S., Mehnert, J., Steinbrink, J., Curio, G., Villringer, A., Müller, K. R., et al. (2012). Enhanced performance by a hybrid NIRS–EEG brain computer interface. Neuroimage, 59(1), 519-29. [DOI:10.1016/j.neuroimage.2011.07.084] [PMID]
  24. Finke, A., Knoblauch, A., Koesling, H., & Ritter, H. (2011). A hybrid brain interface for a humanoid robot assistant. Paper presented at 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, 30 August-3 September, 2011. [DOI:10.1109/IEMBS.2011.6091728] [PMID]
  25. Guger, C., Allison, B. Z., Großwindhager, B., Prückl, R., Hintermüller, C., Kapeller, C., et al. (2012). How many people could use an SSVEP BCI?. Frontiers in Neuroscience, 6, 169. [DOI:10.3389/fnins.2012.00169] [PMID] [PMCID]
  26. Guger, C., Allison, B. Z., & Müller Putz, G. R. (2015). Brain-Computer Interface Research: A state-of-the-art summary 4. In Ch. Guger, B.Z. Allison, G. Edlinger (Eds.), Brain-Computer Interface Research (pp. 1-8). Berlin: Springer. [DOI:10.1007/978-3-319-25190-5]
  27. Herff, C., Fortmann, O., Tse, C. Y., Cheng, X., Putze, F., Heger, D., & Schultz, T. (2015). Hybrid fNIRS-EEG based discrimination of 5 levels of memory load. Paper presented at 2015 7th International IEEE/EMBS Conference on Neural Engineering (NER). Montpellier, France, 2-24 April 2015. [DOI:10.1109/NER.2015.7146546]
  28. Ji, H., Li, J., Lu, R., Gu, R., Cao, L., & Gong, X. (2016). EEG classification for hybrid brain-computer interface using a tensor based multiclass multimodal analysis scheme. Computational intelligence and neuroscience, 2016, 51. [DOI:10.1155/2016/1732836] [PMID] [PMCID]
  29. Khan, M. J., Hong, K. S., Naseer, N., Bhutta, M. R., & Yoon, S. H. (2014a). Hybrid EEG-NIRS BCI for rehabilitation using different-source brain signals. Paper presented at the Annual Conference of the Society of Instrument and Control Engineers (SICE), Sapporo, Japan, 9 September 2014. 
  30. Khan, M. J., Hong, M. J., & Hong, K. S. (2014b). Decoding of four movement directions using hybrid NIRS-EEG brain-computer interface. Frontiers in Human Neuroscience, 8, 244. [DOI:10.3389/fnhum.2014.00244] [PMID] [PMCID]
  31. Kiguchi, K., & Hayashi, Y. (2012). A study of EMG and EEG during perception-assist with an upper-limb power-assist robot. Paper presented at Robotics and Automation (ICRA), Saint Paul, 14-18 May 2012. [DOI:10.1109/ICRA.2012.6225027]
  32. Kim, M., Kim, B. H., & Jo, S. (2015). Quantitative evaluation of a low-cost noninvasive hybrid interface based on EEG and eye movement. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 23(2), 159-68. [DOI:10.1109/TNSRE.2014.2365834] [PMID]
  33. Kim, Y., & Jo, S. (2015). Wearable hybrid brain-computer interface for daily life application. Paper presented at 2015 3rd International Winter Conference on Brain-Computer Interface, Sabuk, South Korea, 12-14 January 2015.
  34. Koo, B., Lee, H. G., Nam, Y., Kang, H., Koh, C. S., Shin, H. C., et al. (2015a). A hybrid EOG-P300 BCI with dual monitors. Paper presented at 2014 International Winter Workshop on Brain-Computer Interface (BCI), Jeongsun-kun, South Korea, 17-19 February 2014.
  35. Koo, B., Lee, H. G., Nam, Y., Kang, H., Koh, C. S., Shin, H. C., et al. (2015b). A hybrid NIRS-EEG system for self-paced brain computer interface with online motor imagery. Journal of Neuroscience Methods, 244, 26-32. [DOI:10.1016/j.jneumeth.2014.04.016] [PMID]
  36. Lalitharatne, T. D., Teramoto, K., Hayashi, Y., & Kiguchi, K. (2013). Towards hybrid EEG-EMG-based control approaches to be used in bio-robotics applications: Current status, challenges and future directions. Paladyn, Journal of Behavioral Robotics, 4(2), 147-54. [DOI:10.2478/pjbr-2013-0009]
  37. Lee, M. H., Fazli, S., Mehnert, J., & Lee, S. W. (2014). Hybrid brain-computer interface based on EEG and NIRS modalities. Paper presented at 2014 International Winter Workshop on Brain-Computer Interface (BCI), Jeongsun-kun, South Korea, 17-19 February 2014. [DOI:10.1109/iww-BCI.2014.6782577]
  38. Leeb, R., Sagha, H., & Chavarriaga, R. (2010). Multimodal fusion of muscle and brain signals for a hybrid-BCI. Paper presented at 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology, Buenos Aires, Argentina, 31 August-4 September 2010. [DOI:10.1109/IEMBS.2010.5626233] [PMID]
  39. Leeb, R., Sagha, H., Chavarriaga, R., & del R Millán, J. (2011). A hybrid brain-computer interface based on the fusion of electroencephalographic and electromyographic activities. Journal of Neural Engineering, 8(2), 025011. [DOI:10.1088/1741-2560/8/2/025011] [PMID]
  40. Li, J., Ji, H., Cao, L., Gu, R., Xia, B., & Huang, Y. (2013). Wheelchair control based on multimodal brain-computer interfaces. Paper presented at International Conference on Neural Information Processing, Daegu, Korea, 3-7 November 2013. [DOI:10.1007/978-3-642-42054-2_54]
  41. Li, J., Ji, H., Cao, L., Zang, D., Gu, R., Xia, B., et al. (2014). Evaluation and application of a hybrid brain computer interface for real wheelchair parallel control with multi-degree of freedom. International Journal of Neural Systems, 24(04), 1450014. [DOI:10.1142/S0129065714500142] [PMID]
  42. Lin, K., Chen, X., Huang, X., Ding, Q., & Gao, X. (2015). A hybrid BCI speller based on the combination of EMG envelopes and SSVEP. Heidelberg: Springer Berlin Heidelberg. [DOI:10.1186/s40535-014-0004-0]
  43. Lin, K., Cinetto, A., Wang, Y., Chen, X., Gao, S., & Gao, X. (2016). An online hybrid BCI system based on SSVEP and EMG. Journal of Neural Engineering, 13(2), 026020. [DOI:10.1088/1741-2560/13/2/026020] [PMID]
  44. Liu, Y. H., Wang, S. H., & Hu, M. R. (2016). A self-paced P300 healthcare brain-computer interface system with SSVEP-based switching control and kernel FDA+SVM-based detector. Applied Sciences, 6(5), 142. [DOI:10.3390/app6050142]
  45. Liu, H., Li, Y., Liu, H., & Wang, S. (2016). Bagging regularized common spatial pattern with hybrid motor imagery and myoelectric signal. Paper presented at 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Shanghai, China, 20-25 March 2016. [DOI:10.1109/ICASSP.2016.7471793]
  46. Long, J., Li, Y., Wang, H., Yu, T., Pan, J., & Li, F. (2012a). A hybrid brain computer interface to control the direction and speed of a simulated or real wheelchair. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 20(5), 720-9. [DOI:10.1109/TNSRE.2012.2197221] [PMID]
  47. Long, J., Li, Y., Yu, T., & Gu, Z. (2012b). Target selection with hybrid feature for BCI-based 2-D cursor control. IEEE Transactions on Biomedical Engineering, 59(1), 132-40. [DOI:10.1109/TBME.2011.2167718] [PMID]
  48. Ma, J., Zhang, Y., Cichocki, A., & Matsuno, F. (2015). A novel EOG/EEG hybrid human-machine interface adopting eye movements and ERPs: Application to robot control. IEEE Transactions on Biomedical Engineering, 62(3), 876-89. [DOI:10.1109/TBME.2014.2369483] [PMID]
  49. Ma, L., Zhang, L., Wang, L., Xu, M., Qi, H., Wan, B., et al. (2012). A hybrid brain-computer interface combining the EEG and NIRS. Paper presented at IEEE International Conference on Virtual Environments Human-Computer Interfaces and Measurement Systems (VECIMS) Proceedings, Tianjin, China, 2-4 July 2012. [DOI:10.1109/VECIMS.2012.6273214]
  50. Meena, Y., Prasad, G., Cecotti, H., & Wong-Lin, K. (2015). Simultaneous gaze and motor imagery hybrid BCI increases single-trial detection performance: A compatible incompatible study. Paper presented at 9th IEEE-EMBS International Summer School on Biomedical Signal Processing, Pavia, Italy, September 2015.
  51. Mercado, L., Rodríguez-Liñán, A., Torres-Treviño, L. M., & Quiroz, G. (2016). Hybrid BCI approach to control an artificial tibio-femoral joint. Paper presented at 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, 16-20 August 2016. [DOI:10.1109/EMBC.2016.7591302]
  52. Müller-Putz, G. R., Leeb, R., Millán, J. D. R., Horki, P., Kreilinger, A., Bauernfeind, et al. (2012). Principles of hybrid brain-computer interfaces. In B.Z. Allison, S. Dunne, R. Leeb, J. D. R. Millán, A. Nijholt (Eds.), Towards Practical Brain-Computer Interfaces (pp. 355-373). Berlin: Springer. [DOI:10.1007/978-3-642-29746-5_18]
  54. Panicker, R. C., Puthusserypady, S., & Sun, Y. (2011). An asynchronous P300 BCI with SSVEP-based control state detection. IEEE Transactions on Biomedical Engineering 58(6), 1781-8. [DOI:10.1109/TBME.2011.2116018] [PMID]
  55. Peng, N., Zhang, R., Zeng, H., Wang, F., Li, K., Li, Y., et al. (2016). Control of a nursing bed based on a hybrid brain-computer interface. Paper presented at 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, 16-20 August 2016. [DOI:10.1109/EMBC.2016.7591008]
  56. Pfurtscheller, G., & Da Silva, F. L. (1999). Event-related EEG/MEG synchronization and desynchronization: basic principles. Clinical Neurophysiology, 110(11), 1842-57. [DOI:10.1016/S1388-2457(99)00141-8]
  57. Pfurtscheller, G., Solis-Escalante, T., Ortner, R., Linortner, P., & Muller-Putz, G. R. (2010). Self-paced operation of an SSVEP-Based orthosis with and without an imagery-based “brain switch:” A feasibility study towards a hybrid BCI. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 18(4), 409-14. [DOI:10.1109/TNSRE.2010.2040837]
  58. Pokorny, C., Breitwieser, C., & Müller-Putz, G. R. (2016). The role of transient target stimuli in a steady-state somatosensory evoked potential-based brain–computer interface setup. Frontiers in Neuroscience, 10, 152. [DOI:10.3389/fnins.2016.00152] [PMID]
  59. Punsawad, Y., Wongsawat, Y., & Parnichkun, M. (2010). Hybrid EEG-EOG brain-computer interface system for practical machine control. Paper presented at 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology, Buenos Aires, Argentina, 31 August- 4 September 2010. [DOI:10.1109/IEMBS.2010.5626745]
  60. Ramli, R., Arof, H., Ibrahim, F., Mokhtar, N., & Idris, M. Y. I. (2015). Using finite state machine and a hybrid of EEG signal and EOG artifacts for an asynchronous wheelchair navigation. Expert Systems with Applications, 42(5), 2451-63.[DOI:10.1016/j.eswa.2014.10.052]
  61. Rebsamen, B., Burdet, E., Zeng, Q., Zhang, H., Ang, M., Teo, C. L., et al. (2008). Hybrid P300 and Mu-Beta brain computer interface to operate a brain controlled wheelchair. Paper presented at Proceedings of the 2nd International Convention on Rehabilitation Engineering & Assistive Technology, Bangkok, Thailand, 13-15 May 2008.
  62. Rebsamen, B., Guan, C., Zhang, H., Wang, C., Teo, C., Ang, M. H., et al. (2010). A brain controlled wheelchair to navigate in familiar environments. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 18(6), 590-8.[DOI:10.1109/TNSRE.2010.2049862]
  63. Riccio, A., Holz, E. M., Aricò, P., Leotta, F., Aloise, F., Desideri, L., et al. (2015). Hybrid P300-based brain-computer interface to improve usability for people with severe motor disability: Electromyography signals for error correction during a spelling task. Archives of Physical Medicine and Rehabilitation, 96(3), S54-S61. [DOI:10.1016/j.apmr.2014.05.029]
  65. Riechmann, H., Hachmeister, N., Ritter, H., & Finke, A. (2011). Asynchronous, parallel on-line classification of P300 and ERD for an efficient hybrid BCI. Paper presented at 5th International IEEE/EMBS Conference on Neural Engineering, NER 2011, Cancun, Mexico, 27 April 2011. [DOI:10.1109/NER.2011.5910574]
  66. Roula, M. A., Kulon, J., & Mamatjan, Y. (2012). Brain-computer interface speller using hybrid P300 and motor imagery signals. 4th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), Rome, Italy, 24-27 June 2012. [DOI:10.1109/BioRob.2012.6290944]
  67. Savić, A., Kisić, U., & Popović, M. B. (2011). Toward a hybrid BCI for grasp rehabilitation. 5th European Conference of the International Federation for Medical and Biological Engineering, Berlin: Springer. [DOI:10.1007/978-3-642-23508-5_210]
  68. Severens, M., Farquhar, J., Duysens, J., & Desain, P. (2013). A multi-signature brain-computer interface: Use of transient and steady-state responses. Journal of Neural Engineering, 10(2), 026005. [DOI:10.1088/1741-2560/10/2/026005] [PMID]
  69. Shin, J., von Luhmann, A., Blankertz, B., Kim, D. W., Jeong, J., Hwang, H. J., et al. (2016). Open access dataset for EEG+ NIRS single-trial classification. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 25(10), 1735-45.
  70. Su, Y., Qi, Y., Luo, J. X., Wu, B., Yang, F., Li, Y., et al. (2011). A hybrid brain-computer interface control strategy in a virtual environment. Journal of Zhejiang University SCIENCE C, 12(5), 351-61. [DOI:10.1631/jzus.C1000208]
  71. Tomita, Y., Vialatte, F. B., Dreyfus, G., Mitsukura, Y., Bakardjian, H., & Cichocki, A. (2014). Bimodal BCI using simultaneously NIRS and EEG. IEEE Transactions on Biomedical Engineering, 61(4), 1274-84. [DOI:10.1109/TBME.2014.2300492]
  72. Usakli, A. B., Gurkan, S., Aloise, F., Vecchiato, G., & Babiloni, F. (2009). A hybrid platform based on EOG and EEG signals to restore communication for patients afflicted with progressive motor neuron diseases. Paper presented at Engineering in Medicine and Biology Society: Engineering the Future of Biomedicine, EMBC 2009, Minneapolis, 6 September 2009. [DOI: 10.1109/IEMBS.2009.5333742] [PMID]
  73. Wang, M., Daly, I., Allison, B. Z., Jin, J., Zhang, Y., Chen, L., et al. (2015). A new hybrid BCI paradigm based on P300 and SSVEP. Journal of Neuroscience Methods, 244, 16-25. [DOI:10.1016/j.jneumeth.2014.06.003]
  74. Wolpaw, J., & Wolpaw, E. W. (2012). Brain-computer interfaces: Principles and practice. Oxford: Oxford University Press. [Doi:10.1093/acprof:oso/9780195388855.001.0001]
  75. Xu, M., Qi, H., Wan, B., Yin, T., Liu, Z., & Ming, D. (2013). A hybrid BCI speller paradigm combining P300 potential and the SSVEP blocking feature. Journal of Neural Engineering, 10(2), 026001. [DOI:10.1088/1741-2560/10/2/026001] [PMCID]
  76. Yang, J., Su, X., Bai, D., Jiang, Y., & Yokoi, H. (2016). Hybrid EEG-EOG system for intelligent prosthesis control based on common spatial pattern algorithm. Paper presented at IEEE International Conference on Information and Automation (ICIA), Ningbo, China, 1-3 August 2016.
  77. Yao, L., Meng, J., Zhang, D., Sheng, X., & Zhu, X. (2014a). A novel hybrid BCI speller based on the incorporation of SSVEP into the P300 paradigm. Journal of Neural Engineering, 10(2), 026012.
  78. Yao, L., Meng, J., Zhang, D., Sheng, X., & Zhu, X. (2014b). Combining motor imagery with selective sensation toward a hybrid-modality BCI. IEEE Transactions on Biomedical Engineering, 61(8), 2304-12. [DOI:10.1109/TBME.2013.2287245]
  79. Yin, E., Zhou, Z., Jiang, J., Chen, F., Liu, Y., & Hu, D. (2014). A speedy hybrid BCI spelling approach combining P300 and SSVEP. IEEE Transactions on Biomedical Engineering, 61(2), 473-83. [DOI:10.1109/TBME.2013.2281976] [PMID]
  80. Yin, E., Zhou, Z., Jiang, J., Chen, F., Liu, Y., & Hu, D. (2013). A novel hybrid BCI speller based on the incorporation of SSVEP into the P300 paradigm. Journal of Neural Engineering, 10(2), 026012. [DOI:10.1088/1741-2560/10/2/026012] [PMID]
  81. Yu, Y., Jiang, J., Zhou, Z., Yin, E., Liu, Y., Wang, J., et al. (2016). A self-paced brain-computer interface speller by combining motor imagery and P300 potential. Paper presented at 8th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC), Hangzhou, China, 27-28 August 2016.
  82. Yu, Y., Zhou, Z., Jiang, J., Yin, E., Liu, K., Wang, J., et al. (2016). Towards a hybrid BCI: Self-paced operation of a P300-based speller by merging a motor imagery-based ‘brain switch’ into a P300 spelling approach. International Journal of Human-Computer Interaction, 33(8), 1-10.
  83. Zheng, W. L., & Lu, B. L. (2017). A multimodal approach to estimating vigilance using EEG and forehead EOG. Journal of Neural Engineering, 14(2), 026017.
Type of Study: Methodological Notes | Subject: Computational Neuroscience
Received: 2017/06/6 | Accepted: 2018/05/29 | Published: 2018/09/1

References
1. Ahn, S., Ahn, M., Cho, H., & Jun, S. C. (2013). hybrid brain-computer interface based on motor imagery and tactile selective attention. In N.L. Millán, R. Leeb, J.D.R. Millán, N. Guechoul, T. Carlson, A. Sobolewski, et al. (Eds.), Proceedings of TOBI Workshop IV (pp. 51-52). Switzerland: École polytechnique fédérale de Lausanne. [PMCID]
2. Faller, J. (2012). BCIs that use steady-state visual evoked potentials or slow cortical potentials. In J. Wolpaw, & E. Winter Wolpaw (Eds.), Brain-Computer Interfaces, Principles and Practice. Oxford: Oxford University Press.
3. Allison, B. Z., Brunner, C., Altstätter, C., Wagner, I. C., Grissmann, S., & Neuper, C. (2012). A hybrid ERD/SSVEP BCI for continuous simultaneous two dimensional cursor control. Journal of Neuroscience Methods, 209(2), 299-307. [DOI:10.1016/j.jneumeth.2012.06.022] [PMID] [DOI:10.1016/j.jneumeth.2012.06.022]
4. Allison, B. Z., Brunner, C., Kaiser, V., Müller-Putz, G. R., Neuper, C., & Pfurtscheller, G. (2010). Toward a hybrid brain-computer interface based on imagined movement and visual attention. Journal of Neural Engineering, 7(2), 026007. [DOI:10.1088/1741-2560/7/2/026007] [PMID] [DOI:10.1088/1741-2560/7/2/026007]
5. Al Shargie, F., Kiguchi, M., Badruddin, N., Dass, S. C., Hani, A. F. M., & Tang, T. B. (2016). Mental stress assessment using simultaneous measurement of EEG and fNIRS. Biomedical Optics Express, 7(10), 3882-98. [DOI:10.1364/BOE.7.003882] [PMID] [PMCID] [DOI:10.1364/BOE.7.003882]
6. Amiri, S., Rabbi, A., Azinfar, L., & Fazel-Rezai, R. (2013). A review of P300, SSVEP, and hybrid P300/SSVEP brain-computer interface systems. In R. Fazel-Rezai (Eds.), Brain-Computer Interface Systems-Recent Progress and Future Prospects. Intech Open. [DOI:10.5772/56135] [DOI:10.5772/56135]
7. Bai, O., Lin, P., Huang, D., Fei, D. Y., & Floeter, M. K. (2010). Towards a user-friendly brain-computer interface: Initial tests in ALS and PLS patients. Clinical Neurophysiology, 121(8), 1293-1303. [DOI:10.1016/j.clinph.2010.02.157] [PMID] [PMCID] [DOI:10.1016/j.clinph.2010.02.157]
8. Bharne, P. P., & Kapgate, D. (2015). Hybrid visual BCI combining SSVEP and P300 with high ITR and accuracy. International Journal of Computer Science and Mobile Computing, 4(6), 1-5.
9. Bhattacharyya, S., Konar, A., & Tibarewala, D. N. (2014). "Motor imagery, P300 and error-related EEG-based robot arm movement control for rehabilitation purpose. Medical & Biological Engineering & Computing, 52(12): 1007-17. [DOI:10.1007/s11517-014-1204-4] [PMID] [DOI:10.1007/s11517-014-1204-4]
10. Bi, L., Lian, J., Jie, K., Lai, R., & Liu, Y. (2014). A speed and direction-based cursor control system with P300 and SSVEP. Biomedical Signal Processing and Control, 14, 126-33. [DOI:10.1016/j.bspc.2014.07.009] [DOI:10.1016/j.bspc.2014.07.009]
11. Breitwieser, C., Pokorny, C., & Gernot, R. M. (2016). A hybrid three-class brain? Computer interface system utilizing SSSEPs and transient ERPs. Journal of Neural Engineering, 13(6), 066015. [DOI:10.1088/1741-2560/13/6/066015] [PMID] [DOI:10.1088/1741-2560/13/6/066015]
12. Brunner, C., Allison, B. Z., Altstätter, C., & Neuper, C. (2011). A comparison of three brain–computer interfaces based on event-related desynchronization, steady state visual evoked potentials, or a hybrid approach using both signals. Journal of Neural Engineering, 8(2), 025010. [DOI:10.1088/1741-2560/8/2/025010] [PMID] [DOI:10.1088/1741-2560/8/2/025010]
13. Buccino, A. P., Keles, H. O., & Omurtag, A. (2016). Hybrid EEG-fNIRS asynchronous brain-computer interface for multiple motor tasks. PloS One, 11(1), e0146610. [DOI:10.1371/journal.pone.0146610] [PMID] [PMCID] [DOI:10.1371/journal.pone.0146610]
14. Cao, L., Li, J., Ji, H., & Jiang, C. (2014). A hybrid brain computer interface system based on the neurophysiological protocol and brain-actuated switch for wheelchair control. Journal of Neuroscience Methods, 229, 33-43. [DOI:10.1016/j.jneumeth.2014.03.011] [PMID] [DOI:10.1016/j.jneumeth.2014.03.011]
15. Capati, F. A., Bechelli, R. P., & Castro, M. C. F. (2016). Hybrid SSVEP/P300 BCI keyboard. In J. Gilbert, H. Azhari, H. Ali, C. Quintão, J. Sliwa, C. Ruiz, et al., (Eds.), Proceedings of the International Joint Conference on Biomedical Engineering Systems and Technologies (pp. 214-8). Setúbal: SciTePress-Science and Technology Publications. [PMID] [PMID]
16. Chang, M. H., Lee, J. S., Heo, J., & Park, K. S. (2016). Eliciting dual-frequency SSVEP using a hybrid SSVEP-P300 BCI. Journal of Neuroscience Methods, 258, 104-13. [DOI:10.1016/j.jneumeth.2015.11.001] [PMID] [DOI:10.1016/j.jneumeth.2015.11.001]
17. Coyle, S., Ward, T., Markham, C., & McDarby, G. (2004). On the Suitability of Near-Infrared (NIR) systems for next-generation brain-computer interfaces. Physiological Measurement, 25(4), 815. [DOI:10.1088/0967-3334/25/4/003] [PMID] [DOI:10.1088/0967-3334/25/4/003]
18. Dong, X., Wang, H., Chen, Z., & Shi, B. E. (2015). Hybrid brain computer interface via Bayesian integration of EEG and eye gaze. Paper presented at the 17th International IEEE/EMBS Conference on Neural Engineering (NER), Montpellier, France, 2 July 2015. [DOI:10.1109/NER.2015.7146582] [DOI:10.1109/NER.2015.7146582]
19. Edlinger, G., Holzner, C., & Guger, C. (2011). A hybrid brain-computer interface for smart home control. Paper presented at the International Conference on Human-Computer Interaction, Las Vegas, 15-20 July 2018. [DOI:10.1007/978-3-642-21605-3_46] [DOI:10.1007/978-3-642-21605-3_46]
20. Évain, A., Argelaguet, F., Casiez, G., Roussel, N., & Lécuyer, A. (2016). Design and evaluation of fusion approach for combining brain and gaze inputs for target selection. Frontiers in Neuroscience, 10, 454. [DOI:10.3389/fnins.2016.00454] [PMID] [PMCID] [DOI:10.3389/fnins.2016.00454]
21. Fazel-Rezai, R., & Ahmad, W. (2011). P300-based brain-computer interface paradigm design. In U. Hoffmann, J.M. Vesin, T. Ebrahimi (Eds.), Recent Advances in Brain-Computer Interface Systems. IEEE Xplore Digital Library. https://doi.org/10.5772/579 [DOI:10.5772/14858]
22. Fazel Rezai, R., Allison, B. Z., Guger, C., Sellers, E. W., Kleih, S. C., & Kübler, A. (2012). P300 brain computer interface: Current challenges and emerging trends. Frontiers in Neuroengineering, 5, 14. [DOI:10.3389/fneng.2012.00014] [PMID] [PMCID] [DOI:10.3389/fneng.2012.00014]
23. Fazli, S., Mehnert, J., Steinbrink, J., Curio, G., Villringer, A., Müller, K. R., et al. (2012). Enhanced performance by a hybrid NIRS–EEG brain computer interface. Neuroimage, 59(1), 519-29. [DOI:10.1016/j.neuroimage.2011.07.084] [PMID] [DOI:10.1016/j.neuroimage.2011.07.084]
24. Finke, A., Knoblauch, A., Koesling, H., & Ritter, H. (2011). A hybrid brain interface for a humanoid robot assistant. Paper presented at 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, 30 August-3 September, 2011. [DOI:10.1109/IEMBS.2011.6091728] [PMID] [DOI:10.1109/IEMBS.2011.6091728]
25. Guger, C., Allison, B. Z., Großwindhager, B., Prückl, R., Hintermüller, C., Kapeller, C., et al. (2012). How many people could use an SSVEP BCI?. Frontiers in Neuroscience, 6, 169. [DOI:10.3389/fnins.2012.00169] [PMID] [PMCID] [DOI:10.3389/fnins.2012.00169]
26. Guger, C., Allison, B. Z., & Müller Putz, G. R. (2015). Brain-Computer Interface Research: A state-of-the-art summary 4. In Ch. Guger, B.Z. Allison, G. Edlinger (Eds.), Brain-Computer Interface Research (pp. 1-8). Berlin: Springer. [DOI:10.1007/978-3-319-25190-5] [DOI:10.1007/978-3-319-25190-5]
27. Herff, C., Fortmann, O., Tse, C. Y., Cheng, X., Putze, F., Heger, D., & Schultz, T. (2015). Hybrid fNIRS-EEG based discrimination of 5 levels of memory load. Paper presented at 2015 7th International IEEE/EMBS Conference on Neural Engineering (NER). Montpellier, France, 2-24 April 2015. [DOI:10.1109/NER.2015.7146546] [DOI:10.1109/NER.2015.7146546]
28. Ji, H., Li, J., Lu, R., Gu, R., Cao, L., & Gong, X. (2016). EEG classification for hybrid brain-computer interface using a tensor based multiclass multimodal analysis scheme. Computational intelligence and neuroscience, 2016, 51. [DOI:10.1155/2016/1732836] [PMID] [PMCID] [DOI:10.1155/2016/1732836]
29. Khan, M. J., Hong, K. S., Naseer, N., Bhutta, M. R., & Yoon, S. H. (2014a). Hybrid EEG-NIRS BCI for rehabilitation using different-source brain signals. Paper presented at the Annual Conference of the Society of Instrument and Control Engineers (SICE), Sapporo, Japan, 9 September 2014.
30. Khan, M. J., Hong, M. J., & Hong, K. S. (2014b). Decoding of four movement directions using hybrid NIRS-EEG brain-computer interface. Frontiers in Human Neuroscience, 8, 244. [DOI:10.3389/fnhum.2014.00244] [PMID] [PMCID] [DOI:10.3389/fnhum.2014.00244]
31. Kiguchi, K., & Hayashi, Y. (2012). A study of EMG and EEG during perception-assist with an upper-limb power-assist robot. Paper presented at Robotics and Automation (ICRA), Saint Paul, 14-18 May 2012. [DOI:10.1109/ICRA.2012.6225027] [DOI:10.1109/ICRA.2012.6225027]
32. Kim, M., Kim, B. H., & Jo, S. (2015). Quantitative evaluation of a low-cost noninvasive hybrid interface based on EEG and eye movement. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 23(2), 159-68. [DOI:10.1109/TNSRE.2014.2365834] [PMID] [DOI:10.1109/TNSRE.2014.2365834]
33. Kim, Y., & Jo, S. (2015). Wearable hybrid brain-computer interface for daily life application. Paper presented at 2015 3rd International Winter Conference on Brain-Computer Interface, Sabuk, South Korea, 12-14 January 2015. [DOI:10.1109/IWW-BCI.2015.7073029]
34. Koo, B., Lee, H. G., Nam, Y., Kang, H., Koh, C. S., Shin, H. C., et al. (2015a). A hybrid EOG-P300 BCI with dual monitors. Paper presented at 2014 International Winter Workshop on Brain-Computer Interface (BCI), Jeongsun-kun, South Korea, 17-19 February 2014.
35. Koo, B., Lee, H. G., Nam, Y., Kang, H., Koh, C. S., Shin, H. C., et al. (2015b). A hybrid NIRS-EEG system for self-paced brain computer interface with online motor imagery. Journal of Neuroscience Methods, 244, 26-32. [DOI:10.1016/j.jneumeth.2014.04.016] [PMID]
36. Lalitharatne, T. D., Teramoto, K., Hayashi, Y., & Kiguchi, K. (2013). Towards hybrid EEG-EMG-based control approaches to be used in bio-robotics applications: Current status, challenges and future directions. Paladyn, Journal of Behavioral Robotics, 4(2), 147-54. [DOI:10.2478/pjbr-2013-0009]
37. Lee, M. H., Fazli, S., Mehnert, J., & Lee, S. W. (2014). Hybrid brain-computer interface based on EEG and NIRS modalities. Paper presented at 2014 International Winter Workshop on Brain-Computer Interface (BCI), Jeongsun-kun, South Korea, 17-19 February 2014. [DOI:10.1109/iww-BCI.2014.6782577]
38. Leeb, R., Sagha, H., & Chavarriaga, R. (2010). Multimodal fusion of muscle and brain signals for a hybrid-BCI. Paper presented at 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology, Buenos Aires, Argentina, 31 August-4 September 2010. [DOI:10.1109/IEMBS.2010.5626233] [PMID]
39. Leeb, R., Sagha, H., Chavarriaga, R., & del R. Millán, J. (2011). A hybrid brain-computer interface based on the fusion of electroencephalographic and electromyographic activities. Journal of Neural Engineering, 8(2), 025011. [DOI:10.1088/1741-2560/8/2/025011] [PMID]
40. Li, J., Ji, H., Cao, L., Gu, R., Xia, B., & Huang, Y. (2013). Wheelchair control based on multimodal brain-computer interfaces. Paper presented at International Conference on Neural Information Processing, Daegu, Korea, 3-7 November 2013. [DOI:10.1007/978-3-642-42054-2_54]
41. Li, J., Ji, H., Cao, L., Zang, D., Gu, R., Xia, B., et al. (2014). Evaluation and application of a hybrid brain computer interface for real wheelchair parallel control with multi-degree of freedom. International Journal of Neural Systems, 24(4), 1450014. [DOI:10.1142/S0129065714500142] [PMID]
42. Lin, K., Chen, X., Huang, X., Ding, Q., & Gao, X. (2015). A hybrid BCI speller based on the combination of EMG envelopes and SSVEP. Heidelberg: Springer Berlin Heidelberg. [DOI:10.1186/s40535-014-0004-0]
43. Lin, K., Cinetto, A., Wang, Y., Chen, X., Gao, S., & Gao, X. (2016). An online hybrid BCI system based on SSVEP and EMG. Journal of Neural Engineering, 13(2), 026020. [DOI:10.1088/1741-2560/13/2/026020] [PMID]
44. Liu, Y. H., Wang, S. H., & Hu, M. R. (2016). A self-paced P300 healthcare brain-computer interface system with SSVEP-based switching control and kernel FDA+SVM-based detector. Applied Sciences, 6(5), 142. [DOI:10.3390/app6050142]
45. Liu, H., Li, Y., Liu, H., & Wang, S. (2016). Bagging regularized common spatial pattern with hybrid motor imagery and myoelectric signal. Paper presented at 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Shanghai, China, 20-25 March 2016. [DOI:10.1109/ICASSP.2016.7471793]
46. Long, J., Li, Y., Wang, H., Yu, T., Pan, J., & Li, F. (2012a). A hybrid brain computer interface to control the direction and speed of a simulated or real wheelchair. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 20(5), 720-9. [DOI:10.1109/TNSRE.2012.2197221] [PMID]
47. Long, J., Li, Y., Yu, T., & Gu, Z. (2012b). Target selection with hybrid feature for BCI-based 2-D cursor control. IEEE Transactions on Biomedical Engineering, 59(1), 132-40. [DOI:10.1109/TBME.2011.2167718] [PMID]
48. Ma, J., Zhang, Y., Cichocki, A., & Matsuno, F. (2015). A novel EOG/EEG hybrid human-machine interface adopting eye movements and ERPs: Application to robot control. IEEE Transactions on Biomedical Engineering, 62(3), 876-89. [DOI:10.1109/TBME.2014.2369483] [PMID]
49. Ma, L., Zhang, L., Wang, L., Xu, M., Qi, H., Wan, B., et al. (2012). A hybrid brain-computer interface combining the EEG and NIRS. Paper presented at IEEE International Conference on Virtual Environments Human-Computer Interfaces and Measurement Systems (VECIMS) Proceedings, Tianjin, China, 2-4 July 2012. [DOI:10.1109/VECIMS.2012.6273214]
50. Meena, Y., Prasad, G., Cecotti, H., & Wong-Lin, K. (2015). Simultaneous gaze and motor imagery hybrid BCI increases single-trial detection performance: A compatible incompatible study. Paper presented at 9th IEEE-EMBS International Summer School on Biomedical Signal Processing, Pavia, Italy, September 2015.
51. Mercado, L., Rodríguez-Liñán, A., Torres-Treviño, L. M., & Quiroz, G. (2016). Hybrid BCI approach to control an artificial tibio-femoral joint. Paper presented at 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, 16-20 August 2016. [DOI:10.1109/EMBC.2016.7591302]
52. Müller-Putz, G. R., Leeb, R., Millán, J. D. R., Horki, P., Kreilinger, A., Bauernfeind, G., et al. (2012). Principles of hybrid brain-computer interfaces. In B. Z. Allison, S. Dunne, R. Leeb, J. D. R. Millán, & A. Nijholt (Eds.), Towards Practical Brain-Computer Interfaces (pp. 355-373). Berlin: Springer. [DOI:10.1007/978-3-642-29746-5_18]
53. Panicker, R. C., Puthusserypady, S., & Sun, Y. (2011). An asynchronous P300 BCI with SSVEP-based control state detection. IEEE Transactions on Biomedical Engineering, 58(6), 1781-8. [DOI:10.1109/TBME.2011.2116018] [PMID]
54. Peng, N., Zhang, R., Zeng, H., Wang, F., Li, K., Li, Y., et al. (2016). Control of a nursing bed based on a hybrid brain-computer interface. Paper presented at 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, 16-20 August 2016. [DOI:10.1109/EMBC.2016.7591008]
55. Pfurtscheller, G., & Lopes da Silva, F. H. (1999). Event-related EEG/MEG synchronization and desynchronization: Basic principles. Clinical Neurophysiology, 110(11), 1842-57. [DOI:10.1016/S1388-2457(99)00141-8]
56. Pfurtscheller, G., Solis-Escalante, T., Ortner, R., Linortner, P., & Müller-Putz, G. R. (2010). Self-paced operation of an SSVEP-based orthosis with and without an imagery-based "brain switch": A feasibility study towards a hybrid BCI. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 18(4), 409-14. [DOI:10.1109/TNSRE.2010.2040837]
57. Pokorny, C., Breitwieser, C., & Müller-Putz, G. R. (2016). The role of transient target stimuli in a steady-state somatosensory evoked potential-based brain–computer interface setup. Frontiers in Neuroscience.
