Facebook Challenge - Are They Listening?
Help us in an experiment. The rules are simple:
Pick a subject. The couple in the video chose cat food.
It can be anything, but a retail item is best. Something you would be looking to buy or a place to visit.
Something you don’t really need and know you haven’t researched or talked about.
Then, just spend some time talking about your subject, with your phone beside you.
Try to do it in a car or outside, away from laptops and smart TVs.
Do not type your subject into any computer or text message.
Be patient. Wait and see if an ad for your subject comes across your Facebook feed within the next week.
Please be sure to tell us your results.
Did Facebook’s experiment violate ethics?
Facebook subjected nearly 700,000 users to an experiment without their knowledge, manipulating these individuals’ news feeds, reducing positive or negative content, and examining the emotional tone of their subsequent posts.
Facebook essentially sought to manipulate people’s mood. This is not a trivial undertaking. What if a depressed person became more depressed? Facebook says that the effect wasn’t large, but it was large enough for the authors to publish the study in a major science journal.
This experiment is scandalous and violates accepted research ethics.
Facebook Tinkers With Users’ Emotions in News Feed Experiment, Stirring Outcry
To Facebook, we are all lab rats.
Facebook routinely adjusts its users’ news feeds — testing out the number of ads they see or the size of photos that appear — often without their knowledge. It is all for the purpose, the company says, of creating a more alluring and useful product.
But last week, Facebook revealed that it had manipulated the news feeds of over half a million randomly selected users to change the number of positive and negative posts they saw. It was part of a psychological study to examine how emotions can be spread on social media.
Facebook’s new face recognition knows you from the side
Facebook still isn’t as good as humans when it comes to identifying people in photos, but it’s getting awfully close.
Facebook researchers published a paper last month in which they detailed the capabilities of a new artificial intelligence system known as “DeepFace.” When asked whether two photos show the same person, DeepFace answers correctly 97.25% of the time; that’s just a shade behind humans, who clock in at 97.53%.
Facebook already uses facial-recognition technology to suggest tags on photos uploaded by users; Google has similar technology for its Google+ social network. But DeepFace represents a big step forward.
Facial recognition technology is everywhere. It may not be legal.
Privacy advocates and representatives from companies like Facebook and Google are meeting in Washington on Thursday to try to set rules for how companies should use this powerful technology. They may be forgetting that a good deal of it could already be illegal.
There are no federal laws that specifically govern the use of facial recognition technology. But while few people know it, and even fewer are talking about it, both Illinois and Texas have laws against using such technology to identify people without their informed consent. That means that one out of every eight Americans currently has a legal right to biometric privacy.
Companies like Facebook and Google routinely collect facial recognition data from their users, too. Their technology may be even more accurate than the government’s. Google’s FaceNet algorithm can identify faces with 99.63 percent accuracy. Facebook’s algorithm, DeepFace, gets a 97.25 percent rating.
How to turn off facial recognition on Facebook
Not in front of the telly: Warning over ‘listening’ TV
Samsung is warning customers about discussing personal information in front of their smart television set.
The warning applies to TV viewers who control their Samsung Smart TV using its voice activation feature.
When the feature is active, such TV sets “listen” to what is said and may share what they hear with Samsung or third parties, it said.
Privacy campaigners said the technology smacked of the telescreens in George Orwell’s 1984, which spied on citizens.
The policy explains that the TV set will be listening to people in the same room to try to spot when commands or queries are issued via the remote. It goes on to say: “If your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party.”