Schoolchildren everywhere (and maybe even some unlucky Yale undergraduates) will soon be penning some version of the “What I did on summer vacation” essay. For accuracy, I should probably say “typing.” And maybe I should define “vacation,” too. In fact, the ability to recall one’s adventures may very well reflect the writer’s “vacation” from said typing device, whether it be a computer or a smartphone, or probably both.

Well, perhaps. (This Internet-addicted writer, for one, has not forgotten the concept of hyperbole.) But recent neuroscience research indicates that minds remember and learn best when given time to process experiences — time often gobbled up by iPhones and BlackBerries.

In one experiment, researchers in the laboratory of Loren Frank at the University of California, San Francisco, placed rats in a W-shaped box and recorded their brain activity both there and later in a resting box. Specific clusters of nerve cells in the hippocampus, the region of the brain responsible for long-term memory formation, were activated when the rats were in the W-shaped box. These same neurons were then reactivated while the rats were in the resting box, indicating that the rats were replaying their experiences. Contrary to the prevailing idea that most memories are consolidated during sleep, this reactivation occurred more often when the rats were awake and resting than when they were asleep, suggesting that periods of rest while awake are also important for learning.

The research is by no means conclusive. It has yet to be shown in rats, let alone in people constantly checking their e-mail, that distractions following an event prevent memory formation. It seems reasonable, however, to expect that if we are continuously consuming media, we may inadvertently fail to hit our preprogrammed “replay” button. Even if we make an effort to get adequate sleep, we may be decreasing our capacity for learning if we don’t allow our brains to rest while awake.

Now back to that summer vacation essay. The more obvious technological impediment for most students with a writing assignment is how to focus long enough to compose a coherent piece of work, ignoring the distractions offered by a standard Web browser. First, there’s the requisite status update on Facebook: “Channeling Ralph Waldo Emerson.” Then a quick instant message to a friend, followed by an e-mail inspection (respond to one, trash the other).

Neuroscientists and psychologists are beginning to study multitasking and how it fits with our current understanding of cognition, as the phenomenon becomes ever more relevant to everyday life. (Ironically, “multitasking” is itself a word borrowed from computer science.)

In the most definitive work on this topic so far, a group at Stanford University administered a filtering game to questionnaire-identified “heavy” and “light” multitaskers. The game asked participants to say whether red rectangles changed orientation between two consecutively flashed images, while “distractor” blue rectangles tried to grab their attention. The “heavy” multitaskers were worse at this game as well as at a task-switching test, in which they were expected to excel.

Although I haven’t taken the study’s multitasking questionnaire, like many Yalies, I suspect I am a medium- to high-level multitasker. I always actively avoid boredom and am rarely without something to listen to or read. But in light of this new research, as an Internet denizen, I have to wonder if I’m changing the brain processes involved in encoding information for long-term storage. Am I doing more harm than good?

It turns out my summer vacation included moving to a new apartment, one without an Internet connection. I’m still online every day (this is by no means a detox), but I’ve found that I read more literature and spend more time in the quiet, just thinking. I’ll probably restore my home Internet connection within the next few months, but as we return to a new semester, I’ll be thinking about how I think and maybe take more frequent breaks from this plugged-in reality.