'' INTERREGNUM : WORLD'S INTERNET-ING! ''
THE FUTURE OF ANOTHER TIMELINE :
A better Internet is waiting for us. Public life has been irrevocably changed by social media, and in the coming future it will be changed by ''The World Students Society''.
Now it's time for something else. We need to stop handing off responsibility for maintaining public space to corporations and algorithms and give it back to human beings.
It's all too easy to imagine an app that uses an algorithm to help choose ''appropriate'' friends for us, or select our news.
This is where curation might go wrong, says Safiya Umoja Noble, a professor at the University of California at Los Angeles. She's the author of the groundbreaking work ''Algorithms of Oppression,'' and was one of the first researchers to warn the public about bias in algorithms.
She identified how data from social media platforms gets fed into algorithms, amplifying human biases about everything from race to politics.
Ms. Noble found, for example, that a Google image search for ''beautiful'' turned up predominantly young white women, and that searches for news turned up conspiracy theories.
Nevertheless, Facebook uses algorithms to suggest stories to us. Advertisers use those algorithms to figure out what we'd like to buy. Search engines use them to figure out the most relevant information for us.
When she thinks about the future, Ms. Noble imagines a counterintuitive and elegantly simple solution to the algorithm problem.
She calls it ''slow media''. As Ms. Noble said : ''Right now, we know billions of items per day are uploaded into Facebook. With that volume of content, it's impossible for the platform to look at all of it and determine whether it should be there or not.''
Trying to keep up with this torrent, media companies have used algorithms to stop the spread of abusive or misleading information.
But so far they haven't helped much. Instead of deploying algorithms to curate content at superhuman speeds, what if future public platforms simply set limits on how quickly content circulates? It would be a much different media experience.
''Maybe you'll submit something and it won't show up the next minute,'' Ms. Noble said. ''That might be positive. Maybe we'll upload things and come back in a week and see if it's there.''
That slowness would give human moderators or curators time to review content.
They could quash dangerous conspiracy theories before they lead to harassment or worse. Or they could behave like old-fashioned newspaper editors, fact-checking content with the people posting it or making sure they have permission to post pictures of someone.
''It might help accomplish privacy goals, or give consumers better control,'' Ms. Noble said. ''It's a completely different business model.''
The key to slow media is that it puts humans back in control of the information they share.
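To make the idea concrete, here is a minimal sketch of what a ''slow media'' pipeline could look like in code. It assumes a simple in-memory queue, an illustrative one-week delay and a hypothetical human review step; none of these names or numbers come from Ms. Noble's proposal.

```python
import time
from collections import deque
from dataclasses import dataclass, field
from typing import Deque, List, Optional

# Illustrative only: a post waits for a minimum delay AND a human decision
# before it becomes visible to anyone else.
MIN_DELAY_SECONDS = 7 * 24 * 3600  # "come back in a week and see if it's there"


@dataclass
class Post:
    author: str
    text: str
    submitted_at: float = field(default_factory=time.time)
    approved: Optional[bool] = None  # None means no human has reviewed it yet


class SlowMediaQueue:
    """Holds submissions until they are both old enough and human-approved."""

    def __init__(self) -> None:
        self.pending: Deque[Post] = deque()
        self.published: List[Post] = []

    def submit(self, post: Post) -> None:
        # Accept the post, but do not publish it right away.
        self.pending.append(post)

    def review(self, post: Post, approved: bool) -> None:
        # A human curator records a decision: fact-check, permissions, context.
        post.approved = approved

    def publish_ready(self) -> List[Post]:
        # Publish only posts that have waited long enough and were approved.
        now = time.time()
        ready = [p for p in self.pending
                 if p.approved and (now - p.submitted_at) >= MIN_DELAY_SECONDS]
        for post in ready:
            self.pending.remove(post)
            self.published.append(post)
        return ready
```

The point of the sketch is the ordering: publication is gated first on a human decision and second on the clock, rather than on an engagement-maximizing algorithm.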
Before I chucked algorithms out altogether, I wanted to find out what our future media might look like if we let algorithms take over fully.
So I contacted Janelle Shane, an algorithm designer and author of a book about [and named by] A.I., ''You Look Like a Thing and I Love You.''
She has spent years creating humorous art with OpenAI's GPT-2 algorithm, a neural network that can predict the next word in a text after learning from eight million web pages.
I asked Ms. Shane whether her algorithm could give us some text that might reveal something about the future after social media.
She prompted the algorithm by feeding in the terms of service from Second Life, a virtual reality social network.
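The article does not describe Ms. Shane's tooling, but a reader who wants to try a similar experiment could do so with the publicly released GPT-2 model through the Hugging Face transformers library. Treat the sketch below as an assumed setup for illustration only; the prompt is a stand-in for the Second Life terms of service, not the actual text she used.

```python
# A rough illustration of prompting GPT-2 to continue a seed text.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# Stand-in for the actual Second Life terms-of-service text.
prompt = "Welcome to Second Life. By using this service, you agree that..."

completions = generator(
    prompt,
    max_length=120,          # total length in tokens, prompt included
    num_return_sequences=3,  # a few alternative continuations
    do_sample=True,          # sample rather than always pick the likeliest word
)

for i, out in enumerate(completions, 1):
    print(f"--- continuation {i} ---")
    print(out["generated_text"])
```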
The Honor and Serving of the Latest Operational Research and writing on the 'Internet and Future' continues.
With respectful dedication to the Scientists, Thinkers, Technologists, Students, Professors and Teachers of the world.
See Ya all prepare and register for Great Global Elections on : The World Students Society : wssciw.bogspot.com and Twitter - !E-WOW! - the Ecosystem 2011 :
''' Science - Writers '''
Good Night and God Bless
SAM Daily Times - the Voice of the Voiceless