A cursor on a computer screen can be controlled using thoughts about a range of vowel sounds, research has found.
Brain signals have been translated into motion or even pictures before, but the current research showcases a nascent technique called electrocorticography.
The approach uses sensors placed directly on the surface of the brain.
The authors of the Journal of Neural Engineering paper say the technique will lead to better "brain-computer interfaces" for the disabled.
A great many studies and demonstrations have in recent years made use of the electroencephalograph, or EEG, typically worn as a "cap" studded with electrodes that pick up the electric fields produced by firing neurons.
The technique has been shown to guide electric wheelchairs or even toys, based only on the wearer's intention.
However, EEGs lose much of the valuable information available closer to the brain itself - what the study's lead author, Eric Leuthardt of Washington University in St Louis in the US, calls the "gold standard" brain signal.
"You cannot get the spatial or the signal resolution," he told BBC News.
"One of the key features in signal resolution is seeing the higher frequencies of brain activity - those higher frequencies have a substantial capability of giving us better insights into cognitive intentions, and part of the reason EEG suffers for this is it acts as a filter of all of these high frequency signals."
That is, the EEG picks up signals from outside the skull, which absorbs and muddles them.
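The filtering effect Dr Leuthardt describes can be illustrated with a short sketch - not from the study itself - in which a crude smoothing filter stands in for the attenuating effect of skull and scalp. The signal, filter length, and frequencies here are all invented for illustration:

```python
import numpy as np

# Synthetic 1-second signal sampled at 1 kHz: a slow 10 Hz rhythm
# plus a smaller 80 Hz component, standing in for the high-frequency
# brain activity mentioned in the article.
fs = 1000
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 80 * t)

# Crude low-pass filter: a 25-sample moving average, a hypothetical
# stand-in for the smoothing the skull applies before EEG electrodes.
kernel = np.ones(25) / 25
smoothed = np.convolve(signal, kernel, mode="same")

def band_power(x, freq):
    """Spectral power at (approximately) a given frequency."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    return spectrum[np.argmin(np.abs(freqs - freq))]

# The 10 Hz rhythm passes through the filter largely intact,
# while the 80 Hz component is heavily attenuated - the EEG's
# "gold standard" problem in miniature.
print("80 Hz power, raw vs smoothed:",
      round(band_power(signal, 80)), round(band_power(smoothed, 80)))
```

The point of the sketch is only that recording outside a low-pass medium discards exactly the high-frequency band that, per Leuthardt, best reveals cognitive intention.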
Electrocorticography, by contrast, is so named because it taps directly into the brain's cortex - the outermost layer of the brain.
In a surgical procedure, a plastic pad containing a number of electrodes is implanted under the skull.
Its power has already been shown off in allowing video game play by thought alone - but in the new study, the researchers have tapped into the speech network of the brain.
Prior studies have made use of the motor control signals in the brain: the thought or will to move in a particular direction.
But Dr Leuthardt said that the units of speech known as phonemes allow signals of a particular "discrete" nature, rather than signals that range in intensity, as with thoughts of motion.
"(It's) for the same reason that you don't type a paper with a mouse - you have a keyboard with a number of discrete commands," he explained.
"We would want to facilitate somebody's ability to communicate by having different phonemes - or essentially key presses - that could allow them to have a discrete type of control."
Four patients who were already undergoing electrocorticograph implantation - to establish the source of incurable epileptic seizures - participated in the latest study.
They were asked to think of four different phonemes - "oo", "ah", "ee" and "eh" - and their brain signals were recorded. Those higher-frequency signals were shown to reliably move a cursor on a computer screen.
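The "key press" style of control Dr Leuthardt describes can be sketched in a few lines: once a decoder has labelled a trial with one of the four imagined phonemes, each label acts as a discrete command. The decoding itself is not described in detail here, and the phoneme-to-direction assignment below is invented purely for illustration:

```python
# Hypothetical mapping from decoded phoneme labels to cursor steps.
# The actual assignment used in the study is not given in the article.
MOVES = {
    "oo": (0, 1),   # up
    "ah": (0, -1),  # down
    "ee": (-1, 0),  # left
    "eh": (1, 0),   # right
}

def drive_cursor(decoded_phonemes, start=(0, 0)):
    """Apply a sequence of decoded phoneme labels as discrete cursor steps."""
    x, y = start
    for phoneme in decoded_phonemes:
        dx, dy = MOVES[phoneme]
        x, y = x + dx, y + dy
    return (x, y)

print(drive_cursor(["ee", "ee", "oo", "eh"]))  # (-1, 1)
```

This is the sense in which phonemes behave like keys rather than a mouse: each one is an unambiguous, repeatable command, not a graded signal whose intensity must be estimated.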
"Do we need that gold standard to get this simple level of control? I think the likely answer is yes," Dr Leuthardt explained.
"For a brain-computer interface, especially for someone who is severely impaired, they need something that is absolutely, completely reliable. If you think of EEG (systems), they move, they're susceptible to noise, and the likelihood for reliability is much lower."
Just a few discrete but reliable signals - tantamount to being able to move a cursor in two dimensions and effect a "click" - could lead to a vast number of applications, he continued.
"What is one of the most prolific '2D-plus-click' devices we have today? It's an iPhone. Once you have 2D plus click... there's innumerable different types of functionality you can create on an application base - but what you first need is the control."
The study also showed that the large-area arrays utilised for the epilepsy research would not be necessary for future electrocorticography implants; an area just 4mm by 4mm can provide the same level of information.