By learning how the brain reacts to external visual stimuli and examining the brain states those stimuli may trigger, we conduct a systematic study of an encoding problem that estimates ongoing EEG dynamics from visual information. A novel generalized system is proposed to encode the alpha oscillations modulated during video viewing, using the visual saliency of the presented natural video stimuli. Focusing on the parietal and occipital lobes, the encoding effects at different alpha frequency bins and brain locations are examined with a real-valued genetic algorithm (GA), and possible links between alpha features and saliency patterns are constructed. The robustness and reliability of the proposed system are demonstrated through 10-fold cross-validation. The results show that stimuli with different saliency levels induce significant changes in occipito-parietal alpha oscillations, and that alpha at higher frequency bins responds most strongly during involuntary attention related to bottom-up visual processing. This study provides a novel approach to understanding how involuntary attention is processed in ongoing brain dynamics and may further benefit the development of brain-computer interfaces and visual design.
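The pipeline outlined above, a real-valued GA selecting how alpha features relate to saliency, validated by 10-fold cross-validation, can be illustrated with a minimal sketch. Everything below is a simplifying assumption rather than the paper's actual method: the data are synthetic, saliency is reduced to a single scalar per trial, and the GA merely fits per-frequency-bin weights that map saliency to alpha power.

```python
# Hypothetical sketch: a real-valued GA that learns per-frequency-bin weights
# mapping a scalar visual-saliency value to alpha power, scored by 10-fold CV.
# All data and names (N_FEAT, saliency, alpha) are synthetic and illustrative.
import random

random.seed(0)

N_FEAT = 6      # e.g., 6 alpha frequency bins at one occipito-parietal site
N_TRIALS = 100

# Synthetic trials: alpha power in each bin is a noisy linear function of saliency.
true_w = [0.8, 0.1, -0.3, 0.5, 0.0, 0.9]
saliency = [random.uniform(0, 1) for _ in range(N_TRIALS)]
alpha = [[true_w[j] * s + random.gauss(0, 0.1) for j in range(N_FEAT)]
         for s in saliency]

def mse(w, idx):
    """Mean squared error of predicting alpha power from saliency with weights w."""
    err = 0.0
    for i in idx:
        for j in range(N_FEAT):
            err += (alpha[i][j] - w[j] * saliency[i]) ** 2
    return err / (len(idx) * N_FEAT)

def ga(train_idx, pop=40, gens=60, sigma=0.2):
    """Minimal real-valued GA: truncation selection, blend crossover, Gaussian mutation."""
    popl = [[random.uniform(-1, 1) for _ in range(N_FEAT)] for _ in range(pop)]
    for _ in range(gens):
        elite = sorted(popl, key=lambda w: mse(w, train_idx))[: pop // 2]
        children = []
        while len(children) < pop - len(elite):
            a, b = random.sample(elite, 2)
            children.append([(x + y) / 2 + random.gauss(0, sigma)
                             for x, y in zip(a, b)])
        popl = elite + children
    return min(popl, key=lambda w: mse(w, train_idx))

# 10-fold cross-validation of the GA-fitted weights.
idx = list(range(N_TRIALS))
random.shuffle(idx)
folds = [idx[k::10] for k in range(10)]
test_errs = []
for k in range(10):
    test = set(folds[k])
    train = [i for i in idx if i not in test]
    w = ga(train)
    test_errs.append(mse(w, folds[k]))

print("mean 10-fold test MSE: %.4f" % (sum(test_errs) / 10))
```

In the actual study the GA would search over many channels and frequency bins jointly and the fitness would reflect the encoding model's fit to real EEG; this sketch only shows the optimization-plus-cross-validation skeleton.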