Everything posted by allheart55 Cindy E
It's crazy, I make disks for my people all of the time. This guy wasn't hurting anyone, especially Microsoft.
Tagged with: e-waste, microsoft restore disks (and 1 more)
5 replies
Eric Lundgren, an e-waste recycler and inventor, will have to serve a 15-month prison sentence for selling Microsoft restore disks. Lundgren received his sentence months ago, but a federal appeals court confirmed it earlier this month. He’ll also have to pay a $50,000 fine.

Although restore disks are given to everyone who buys a computer with a licensed version of Windows (and can be downloaded for free), Microsoft decided to press criminal charges against Lundgren for distributing the disks, which he did to help people keep their computers running longer. Microsoft argued that this free-to-download software was worth $25 per disk, which the court accepted.

Lundgren had 28,000 disks made and shipped to a broker in Florida in an effort to sell them to refurbishing shops for 25 cents each. The shops then wouldn’t have to make the disks themselves, and users who don’t necessarily know they can download the software online could keep their computers running without buying entirely new ones. The appeals court says Lundgren’s infringement of Microsoft products cost up to $700,000.

Microsoft issued this statement to The Verge on the ruling:

“Microsoft actively supports efforts to address e-waste and has worked with responsible e-recyclers to recycle more than 11 million kilograms of e-waste since 2006. Unlike most e-recyclers, Mr. Lundgren sought out counterfeit software which he disguised as legitimate and sold to other refurbishers. This counterfeit software exposes people who purchase recycled PCs to malware and other forms of cybercrime, which puts their security at risk and ultimately hurts the market for recycled products.”

Nathan Proctor, director of US PIRG’s Right to Repair campaign, issued a statement on Lundgren’s sentencing: “Companies have gotten too aggressive in pushing us to throw things away and buy new things. What we should be doing instead is reusing more, repairing more, and recycling the rest — ideas that Eric Lundgren has been pioneering.”

The Right to Repair has been hotly debated in recent months, particularly because California proposed a law that would require electronics manufacturers to make repair information and parts available to product owners and to third-party repair shops and services. Seventeen other states have proposed similar legislation. Most major tech companies, including Apple and Microsoft, oppose letting users fix their own devices on the grounds that it poses a security risk to users, as we can see in Microsoft’s statement above. As Lundgren’s case demonstrates, though, the companies are likely more concerned about lost profit than anything else.

Update 4/25, 12:31 PM ET: Updated to include Microsoft’s comment.

Source: The Verge
A report accusing large numbers of child-centred Android apps of potentially breaking US law? It’s the sort of finding that even a company of Google’s almost unassailable power can’t ignore.

The trouble started a week ago when International Computer Science Institute researchers published “Won’t Somebody Think of the Children? Examining COPPA Compliance at Scale”, a reference to the Children’s Online Privacy Protection Act of 1998, which protects under-13s.

After analysing 5,855 Android apps that claim to comply with the Google Play Store’s Designed for Families (DFF) program, the researchers found what’s best described as a privacy and surveillance mess. 40% were transmitting personal information “without applying reasonable security measures” (SSL/TLS encryption), while another 18.8% were sharing data with third parties that could be used to identify children and their devices for profiling. Almost one in twenty were sharing personal data, such as email addresses and social media profiles, with third parties without consent. The long and short of this:

“Overall, roughly 57% of the 5,855 child-directed apps that we analyzed are potentially violating COPPA.”

The underlying problem appears to be the Wild West of third-party software development kits (SDKs), which have privacy-protecting settings turned off or ignored – even, in some cases, when the SDKs’ terms of service prohibit such use in apps designed for children.

It appears Google’s much-vaunted DFF program is big on promises but weak on the kind of enforcement that might hold app developers to account. Making the matter worse: Google already performs static and dynamic analysis on apps submitted to the Play Store, so it should not be hard for them to augment this analysis to detect non-compliant entities. Not to forget that it’s just over a year since Google threatened to remove apps that breach its general privacy terms and conditions.

A few months ago, this report might have attracted a few headlines and then been submerged by a tide of new stories and quickly forgotten. However, its publication only weeks after Facebook found itself hauled up for its privacy design means that’s unlikely to be the case.

It’s not as if this is the first bunch of apps researchers have found privacy and security problems with, and yet, unusually, Google felt compelled to issue a holding statement:

“Protecting kids and families is a top priority, and our Designed for Families program requires developers to abide by specific requirements above and beyond our standard Google Play policies. We’re taking the researchers’ report very seriously and looking into their findings. If we determine that an app violates our policies, we will take action.”

Google, then, is going to look into the issue of app compliance with DFF, and perhaps how this affects COPPA too. The problem with this response is that it sounds a bit like Facebook’s way of dealing with years of privacy complaints – kick the problem down the road but leave the model that caused it, self-regulation, untouched.

Source: Sophos
Tagged with: android apps, children (and 2 more)
It should have been an easy question to answer. It came from Florida Rep. Kathy Castor during the House’s questioning of Facebook CEO Mark Zuckerberg last week, when she asked: “You are collecting personal data on people who are not Facebook users. Yes or no?”

There was no yes or no to be had, so she tried again: “You watch where we go. Isn’t that correct?” Zuckerberg’s response: “Everyone has control over how that works.”

She wasn’t the only member of the House Energy and Commerce Committee to press the CEO about how much information the company collects about both users and non-users. As Castor put it, “It’s practically impossible these days to remain untracked in America,” and it’s led to a “devil’s bargain” in which people are “spied on” and tracked even after they leave the platform.

On Monday, Facebook finally coughed up the answer. It’s no shocker: the answer is yes. Facebook tracks both users and non-users across websites and apps, according to a post written by David Baser, Product Management Director. It does so for three main reasons, he said: to provide its services to the sites or apps; to improve safety and security on Facebook; and to enhance its own products and services.

From the post: “When you visit a site or app that uses our services, we receive information even if you’re logged out or don’t have a Facebook account. This is because other apps and sites don’t know who is using Facebook.”

Facebook is far from the only online service to do this. Twitter, Pinterest and LinkedIn have similar Like and Share buttons, Google has a popular analytics service, and Amazon, Google and Twitter all offer login features, Baser said. In fact, most websites and apps send the same information to multiple companies each time you visit them.

Baser emphasized that “We don’t sell people’s data. Period.” And, just as Zuckerberg repeatedly told Senators and Representatives last week, Baser said that Facebook is focused on putting users in control of their data and that the company is trying to be more transparent about the data it collects and how that data is used:

“Whether it’s information from apps and websites, or information you share with other people on Facebook, we want to put you in control – and be transparent about what information Facebook has and how it is used. We’ll keep working to make that easier.”

That transparency doesn’t extend to letting non-users get at the data Facebook collects about them, however. On Wednesday, Zuckerberg responded to questions from Rep. Ben Luján by explaining that Facebook collects “data of people who have not signed up for Facebook” for “security purposes,” explaining how it helps to prevent scraping: “…in general we collect data on people who have not signed up for Facebook for security purposes to prevent the kind of scraping you were just referring to … we need to know when someone is repeatedly trying to access our services.”

The CEO didn’t explain what, if anything, else Facebook might be doing with the data it gathers on non-members. Lawmakers and privacy advocates immediately responded, with many saying that Facebook needed to develop a way for non-users to find out what the company knows about them. On Friday, Facebook said it had no plans to build such a tool, according to Reuters.

In his post on Monday, Baser added a bit of detail about the security purposes behind the collection of non-users’ data: “If someone tries to log into your account using an IP address from a different country, we might ask some questions to verify it’s you. Or if a browser has visited hundreds of sites in the last five minutes, that’s a sign the device might be a bot.”
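Baser’s second example amounts to a rate check: flag any device whose visit count within a short window is implausibly high. As a rough illustration only (Facebook’s actual detection is not public; the window size, threshold, and data structures here are assumptions), the heuristic can be sketched as a per-device sliding-window counter:

```python
from collections import defaultdict, deque

# Hypothetical values for illustration -- not Facebook's real parameters.
WINDOW_SECONDS = 5 * 60        # "the last five minutes"
MAX_SITES_PER_WINDOW = 100     # "hundreds of sites" trips the check

class VisitRateMonitor:
    """Counts site visits per device within a sliding time window."""

    def __init__(self, window=WINDOW_SECONDS, limit=MAX_SITES_PER_WINDOW):
        self.window = window
        self.limit = limit
        self.visits = defaultdict(deque)  # device_id -> visit timestamps

    def record(self, device_id, timestamp):
        q = self.visits[device_id]
        q.append(timestamp)
        # Discard visits that have aged out of the window.
        while q and q[0] <= timestamp - self.window:
            q.popleft()

    def looks_like_bot(self, device_id):
        return len(self.visits[device_id]) > self.limit

monitor = VisitRateMonitor()
# A device hitting 300 sites inside one minute trips the heuristic...
for i in range(300):
    monitor.record("device-a", i * 0.2)
print(monitor.looks_like_bot("device-a"))   # True
# ...while a device visiting a handful of sites does not.
for t in (0, 60, 120):
    monitor.record("device-b", t)
print(monitor.looks_like_bot("device-b"))   # False
```

A deque per device keeps both recording a visit and pruning old ones cheap, since only timestamps inside the window are ever retained.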
Baser explained that one of the services Facebook provides to websites and apps is Audience Network: a service that lets advertisers create ads on Facebook that show up elsewhere in cyberspace. Advertisers can also target non-users with a tiny but powerful snippet of code known as the Facebook Pixel: a web targeting system embedded in many third-party sites. Facebook has lauded it as a clever way to serve targeted ads to people, including non-members.

Conspicuous by its absence from the blog post was any mention of shadow profiles: profiles of people who’ve never signed up for Facebook. European countries have been battling Facebook over shadow profiles for years. In 2011, an Irish privacy group sent a complaint about shadow profiling – the collection of data including, but not limited to, email addresses, names, telephone numbers, addresses and work information – from non-members. More recently, in the latest installment of a long-running privacy case, a Belgian court ordered Facebook to stop profiling non-members in the country or face a daily fine.

But what, exactly, can non-users do about this tracking? Facebook sent this statement to Reuters: “This kind of data collection is fundamental to how the internet works. There are basic things you can do to limit the use of this information for advertising, like using browser or device settings to delete cookies. This would apply to other services beyond Facebook because, as mentioned, it is standard to how the internet works.”

Source: Sophos
I think the new skins are great.
Sounds good! Thanks for letting us know, Bob.
Google’s effort to rein in the so-called right to be forgotten has taken a hit after U.K. and French judges said the reputation of businessmen tarnished by old news stories about improper conduct trumped the public’s need to know.

Google was told by a London judge on Friday, for the first time, to remove links to older stories about a businessman’s criminal conviction from search results. The decision follows last month’s ruling by a French court about a former chief financial officer whose job prospects were hampered by stories about a fine for civil insider-trading violations.

The European rulings are starting to stack up, hurting Google’s ability to fight future court cases over how far it should go in deleting links. It comes amid a storm of European criticism over U.S. technology companies’ ability to protect user privacy. Facebook Inc. is embroiled in a scandal over revelations that the data of tens of millions of people was improperly shared with a political consulting firm.

“It means even more work for Google,” said Jon Baines, the data protection adviser at London law firm Mishcon de Reya. “They will have to give even more intense scrutiny to the facts of an application for delisting.”

In the U.K. ruling, Judge Mark Warby said 11 articles related to the businessman should be delisted by Google.

Crime and Punishment

“The crime and punishment information has become out of date, irrelevant and of no sufficient legitimate interest to users,” Warby said. A second businessman failed in his bid to have links taken down to articles about a more serious crime.

The Alphabet Inc. unit must remove information about a person on request if it’s outdated or irrelevant, under a 2014 European Union top court ruling. The original EU ruling, however, failed to outline clear terms for when the search engine should remove information, and the U.K. decision may be the first from a major court to effectively rule that criminal conduct can be erased.

“We work hard to comply with the Right to be Forgotten, but we take great care not to remove search results that are in the public interest and will defend the public’s right to access lawful information,” Mountain View, California-based Google said in a statement.

No Damages

The court refused to award damages to the businessman, saying Google took reasonable care in the case. The two men, who can’t be identified because of a court order, had asked that links to information on their old convictions be taken down. Under English law designed to rehabilitate offenders, those convictions don’t have to be disclosed to potential employers and can effectively be ignored.

The businessman, known as NT2, was imprisoned for six months in the early part of this century after authorizing an investigations firm to conduct computer hacking and phone tapping to find out who was engaged in hostile activity against his company. “His past offending is of little if any relevance,” Justice Warby wrote. “There is no real need for anybody to be warned about that.”

The judge considered the businessmen’s current conduct as part of his ruling. “The situation as it is now can potentially influence whether the information is still relevant,” Baines at Mishcon de Reya said. But courts throughout Europe are likely to interpret the right to be forgotten in different ways.

French Ruling

In last month’s French ruling, Paris judges said that Google had to reduce the visibility of stories about a former chief financial officer at a French company who was fined 200,000 euros ($247,000) for civil insider-trading violations. The judges said the right to privacy should prevail, after laying down a series of benchmarks – including impact on work and family life – that dictate whether results should be easily available.

The ruling pointed out that the father of four didn’t profit financially from the violations and was at risk of losing his job again unless the articles were demoted in search results for his name. “Given his family situation, the loss of his job would cause him a very serious prejudice, especially given that it took him nearly two years to find a new job,” the judges said. In those circumstances, “the public interest in having information with his name about this case doesn’t prevail.”

Jonathan Coad, a media lawyer at Keystone Law in London, said there’s an inherent problem with the degree of latitude each EU country has in implementing the right to be forgotten. “We’re all supposed to be complying with” the same privacy laws, Coad said. “But French privacy laws are much more draconian than U.K. ones.”

The two U.K. cases are NT1 v. Google and NT2 v. Google, High Court of Justice, Queen’s Bench Division, Case Nos. HQ15X04128 and HQ15X04127.

Source: IT Pro Today