(CNN) -- The meal you ate the first day you started working. The first exam you aced in high school. The shoes you wore to the prom.
These minute details of life often fade into the abyss of memory, which is not a perfect scrapbook of every experience. Over time, we forget details of events that happened long ago or even misremember them.
But today's technology creates opportunities for greater, moment-by-moment record-keeping. Archives of your blog, Facebook or Twitter feed -- both in text and in pictures -- might reveal exactly what you ate on important occasions, the papers you were proud of and the outfits you wore.
Microsoft is developing a camera that takes this further: SenseCam, which automatically captures photos of everything you see and do all day. There are even people, such as Microsoft researcher Gordon Bell, who go around with audio and video equipment to create a more complete supplement to biological memory.
If we rely on technology for documenting, sorting and storing information -- creating digital diaries, or "lifestreaming" -- what will become of our minds? Although there is not a lot of research on this subject, psychologists have a range of opinions about where we're headed.
Using computers to outsource many mental tasks isn't necessarily bad as long as the brain is being stimulated in other ways, said David Bucci, associate professor of psychology at Dartmouth College. With the advent of the calculator, the need to memorize multiplication tables became less important, and hand-held devices such as smart phones and PDAs have eliminated the need to remember phone numbers and to-do lists.
But none of these task-replacing devices has been shown to damage the mind. Rather, they allow people to devote more brain resources to other things, he said.
Learning how to use technologies is good for the brain, just like learning a language or doing puzzles, he said. There is also evidence that physical activity helps stave off dementia.
"The key thing is that the sedentary brain, just like the sedentary body, is going to atrophy," he said.
Although people may lose interest in services such as Twitter, recording everything may one day take a lot less effort. In fact, it's not inconceivable that a microchip could be implanted in the brain that would be used to make external copies of memory, said Dr. Gary Small, director of the UCLA Center on Aging and co-author of the book "iBrain: Surviving the Technological Alteration of the Modern Mind."
On the plus side, this would help patients with Alzheimer's disease who have begun to forget key elements of their existence, he said. Presenting them with their own memories could add "another 10 years of cognitive life," he said.
This could work for those in the early stages of the disease, but eventually there comes a point where people with Alzheimer's no longer recognize their lives as their own, said Barry Schwartz, professor of social action and social theory at Swarthmore College in Swarthmore, Pennsylvania.
Relying on digital documents, moreover, may take away from the learning process, Schwartz said.
For years, he gave take-home exams for which students could use any reference materials they wished, but it became apparent that they did not actually internalize the material.
These days, Schwartz gives exams that are closed-book. Students just have to remember what they learned.
"You can't walk around through life carrying all of your books under your arm," he said. "It's possible that these devices that people now rely on will discourage anyone from doing this disciplined learning that we always used to do."
Relying on the Internet for answers does have its advantages in terms of brain stimulation. A recent study from Small's group at UCLA found that middle-aged and elderly adults who didn't have much experience with the Web showed increased activity in key areas of the brain after searching the Web for an hour each day for two weeks.
But recording everything you do takes people out of the "here and now," psychologists say. Constant documenting may make people less thoughtful about and engaged in what they're doing because they are focused on the recording process, Schwartz said.
Moreover, if these documented memories are available to others, people may actually do things differently.
"If we have experiences with an eye toward the expectation that in the next five minutes, we're going to tweet them, we may choose different experiences to have, ones that we can talk about rather than ones we have an interest in," he said.
Similarly, a 1993 study led by researchers at the University of Virginia found that undergraduate students who were asked to think about their reasons for choosing posters chose differently and reported less satisfaction than those who did not have to justify their choices.
Since memories are not perfectly accurate depictions of situations and get fuzzier over time, they may be problematic in situations such as giving testimony at a trial, Schwartz said. But there are also benefits to this fuzziness: It helps people come up with coherent narratives about what their lives are about, he said.
Being able to compress a lot of experiences and summarize them well is part of the very nature of human intelligence, said Douglas Hofstadter, professor of cognitive science at Indiana University, Bloomington, and author of "Gödel, Escher, Bach: An Eternal Golden Braid."
"It's about finding the essence of things," he said. "It's not about restoring everything. It's about reducing things in complexity until they're manageable and understandable."
Hofstadter did keep a diary from age 19 to 25, and once in a while he goes back and reads a hundred handwritten pages. But each entry is short and does not capture life decades ago in real time; it just sums it up.
"Are you likely to spend a week of your life when you're 60 looking at your life when you're 20? No," he said.