Editor's note: Adam Ostrow is editor-in-chief of Mashable. He spoke at the TED Global conference in Edinburgh, UK, in July. TED is a nonprofit dedicated to "Ideas worth spreading," through talks which it makes available on its website.
(CNN) -- How much do you know about your great-grandparents? In most cases, the answer to that question is "Not much." But that's something that will be forever changed as a result of the hundreds of thousands of pieces of digital content the average person will produce in his or her lifetime.
Think about it -- while at best you might have a few photos, newspaper clippings or secondhand accounts of your ancestors, our descendants and all those to follow will have at their fingertips a deep digital archive of information that we created ourselves.
That's just one of the many things that started occurring to me earlier this year when I began to ponder what the social media revolution means for our digital legacies.
But my belief isn't just that the content we're creating will leave behind an interesting first-person account of each of us, giving future generations a new way of understanding our past. Ultimately, I think we're heading toward something far more intriguing, prominent and potentially dangerous.
TED.com: How to make a splash in social media
That's because I think as the quantity of content we're producing and technology's ability to make sense of it continue to expand exponentially, it will inevitably become possible to not only define our own legacies, but to recreate very lifelike representations of ourselves.
Here's why: On one hand, you have people creating media in huge numbers -- already the average Facebook user is sharing 90 pieces of information per month, ranging from status updates, to photos, to videos, to links to a trail of the places they've been. Over the course of a lifetime, that's a tremendous amount of personal information and insight into how people think, how they act, and whom they interact with.
On the other hand, you have all of that data being indexed in the cloud. You have other areas of technology -- like machine learning -- expanding in capability to the point where artificial intelligence can beat humans in a game of Jeopardy. There are now robots that are scratching the surface of being able to understand human emotion. And social media -- still just a decade old -- will continue to evolve and offer far more robust ways of interacting and projecting oneself digitally.
Combine these two powerful forces with inevitable technological change that goes far beyond what most of us are able to comprehend, and you end up at the scenario I presented at TED: lifelike representations of ourselves, interacting in the real world based on the content we created during our lifetimes, long after we're gone.
But do we want holographic representations of ourselves -- something we've already seen enabled through technology like TelePresence -- living on forever? Will being able to recreate our loved ones make it harder to find closure? What types of rules should govern the commercialization of something many would pay anything for? These are the types of questions our society will be dealing with in the years to come.
TED.com: Compassion and the true meaning of empathy
Already, some people are taking control of their postmortem plans for the digital world. And entrepreneurs are creating a nascent industry around the issue, with services that allow people to do everything from appointing a "digital executor" to preparing messages that get posted upon death.
Meanwhile, as we start to see social media users perish, millions of profiles, walls and blogs are becoming online memorials.
Thinking about what happens to our online identities after we die isn't a comfortable topic -- akin to buying life insurance or declaring organ donor status on your driver's license. But it may become one of the most important decisions we make, as the way we're remembered becomes forever linked to our digital persona -- perhaps in ways far beyond what we can imagine today.
The opinions expressed in this commentary are solely those of Adam Ostrow.