Because we are in a time when all the concepts that inspired us are rapidly becoming accessible: we could call it a campaign, but it's become our state of mind.
No matter what big scientific words we use, at the end of the day this is what we're doing: we're replacing pressing buttons with controlled thoughts.
So we'll make sure that the science behind it will fuel all the dreams we had from when we were kids: focus on that (virtual) glass, make it move.
The Visual Cortex takes care of what we see.
In short, light enters our eyes in the form of photons, which trigger small electrochemical signals that travel all the way to the back of the head - to the Visual Cortex - which processes the data and routes it around the brain based on its purpose and urgency.
I'll ask you to look at this image and then close your eyes and try to recall it.
Do you think the Visual Cortex had any work to do this second time? It turns out it did!
Just like imagined movements have an effect on the Motor Cortex, visual imagery has an effect on the Visual Cortex - and a similar one to seeing the real thing.
Even more, according to this study, it seems that different portions of the Visual Cortex activate based on the imagined visual's attributes, such as size.
As a side note: even Imagined Speech has an effect on the whole speech apparatus, leading to the possibility of telepathic communication in the near future.
As far as non-invasive technology goes, we can definitely read signals from the Visual Cortex and most other areas around the brain.
Visually Evoked Potentials have been used for many, many years to select flashing objects on a computer screen, based on the synchronisation between the flashes and the response within the Visual Cortex.
That's something like trying to identify a planet based solely on how fast it rotates around its axis. It's feasible in the context of our Solar System, but that's about it.
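The flashing-object selection described above can be sketched in a few lines: each on-screen object flickers at its own frequency, and we pick the object whose frequency dominates the EEG spectrum. This is a minimal, hypothetical sketch - the sampling rate, target frequencies, and synthetic signal are all illustrative stand-ins, not a real recording pipeline:

```python
import numpy as np

FS = 250  # assumed sampling rate (Hz), typical for consumer EEG headsets
TARGETS = {"A": 8.0, "B": 10.0, "C": 12.0}  # hypothetical object -> flicker frequency

def select_target(eeg: np.ndarray, fs: int = FS) -> str:
    """Return the on-screen target whose flicker frequency carries the most power."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2          # power spectrum of the signal
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)     # frequency of each FFT bin

    def power_at(f: float) -> float:
        # Power in the FFT bin closest to frequency f
        return float(spectrum[np.argmin(np.abs(freqs - f))])

    return max(TARGETS, key=lambda name: power_at(TARGETS[name]))

# Synthetic 2-second "recording": the user looks at target "B" (10 Hz flicker),
# so a 10 Hz component rides on top of random noise.
t = np.arange(0, 2.0, 1.0 / FS)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.5 * rng.standard_normal(t.size)
print(select_target(eeg))  # -> "B"
```

Real SSVEP systems add filtering, multiple electrodes, and smarter detectors (e.g. canonical correlation analysis), but the core idea is exactly this frequency matching.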
A recent research paper started creating waves in the Neurotech community.
DreamDiffusion: Generating High-Quality Images from Brain EEG Signals is the naturally occurring next step in the journey of bringing AI closer to Neurotech.
While previous approaches tried to build a context from small Lego blocks, current approaches - let's call them top-down - aim to match whole abstract ideas to existing similar ones.
Let's take for example our previous image: NeuralEcho Labs' logo, placed on top of digits "250".
One way to reproduce it from EEG signals would be to find a pattern for individual features such as "2" and "5" - their curvature, angles, colours and positions - and then construct the image brick by brick. That would be ideal; however, we still have some way to go before we get there.
Another way is to go higher-level and compare parts of the image against a database of image parts. When we find a strong match, we start putting together the final image.
As long as our imagination doesn't stray too far from the limits of our database, the reproduced images will match at least at the conceptual level.
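The higher-level matching can be sketched as nearest-neighbour search in an embedding space: map the EEG recording to a vector, then find the most similar image vector in a database. Everything here is a hypothetical stand-in - the concept names, the random vectors (which in a real system would come from an image encoder), and the fake "EEG embedding":

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in database: concept -> image embedding. In practice these would be
# produced by an image encoder; random 128-dim vectors here, for illustration.
DB = {name: rng.standard_normal(128) for name in ("logo", "digits", "glass")}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def closest_concept(eeg_embedding: np.ndarray) -> str:
    """Return the database concept most similar to the EEG-derived embedding."""
    return max(DB, key=lambda name: cosine(eeg_embedding, DB[name]))

# Pretend the EEG encoder produced something close to the "digits" embedding.
query = DB["digits"] + 0.1 * rng.standard_normal(128)
print(closest_concept(query))  # -> "digits"
```

The real systems then feed the matched embedding into a generative model to render the final image, but the "compare against what we already know" step is the heart of the top-down approach.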
I had a chat a while ago with an inspiring young lady who wants to pursue her dream of making a device that reads dreams. 10 years ago that would have sounded like complete Sci-Fi. Today? Much more Sci, much less Fi. If you'd like to reach out and help her, I'm sure Hillary's LinkedIn profile is the best way.
Many philosophers say that dreams are our authentic self's way of telling us what's wrong and what's right. Watching a video of them might do a better job of convincing us to act.
Take Adventure/RPG games: "In order to open this door, remember what Lord Jonas gave you at the beginning of your journey". Remembering is imagining - it makes sense, right?
Let's think about First Person Shooters: you're almost out of ammo and all your fingers are already engaged. Think about reloading. And it's done.
I'm sure you can think of many more use-cases.
I get home, I'm hungry, I'm thinking about a cheeseburger. My smart-home assistant interrupts: "Unfortunately you're on a diet, try to constrain yourself. There is plenty of fresh broccoli in the fridge."
Future me will be thankful.
Every day there are plenty of moments when our thoughts are not followed by actions. Many times we're missing out on big things. So why not have a helping hand?
You think about starting the washing machine, but someone's at the door and you forget. Next morning you find there are no clean shirts. We can avoid that, right?
Clearly, any amazingly beneficial technology can be used in equally negative ways.
There's not much to do about it except inform yourself. Know what is possible and what is not. The only way others will turn good into bad is through our own ignorance.
The article's poster references the film Inception - where dream "surfers" fight to bend reality.
The big gamer in the middle is you and me. We got this.
One more thing: everywhere we turn we'll find an ancient book, a modern best-seller or a video recommending that people visualise their wishes if they want them to become reality. There is truth in that, and I'm sure this is a step towards understanding more of what today seems like magic.