WacOS Cloud

Info taken from

WacOS/!Cloud

About

I cannot support iCloud for this project due to its uncertain future. You can read more about the problems with iCloud below.

Instead, the following cloud services are planned to be supported:

Degoo

pCloud

ProtonDrive

!Cloud is an unofficial pun on iCloud that fits this project, although the C is not reversed. That is just coincidental, but I like that it worked out this way; originally I just wanted the cloud options to sort to the top of the repository.

The WacOS Cloud service set is written in Python.
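Since the repository describes the cloud layer only at this high level, here is a minimal, purely hypothetical Python sketch of how several backends could sit behind one shared interface. None of these names (CloudProvider, PCloudProvider, upload, download, sync_file) come from the WacOS codebase; they are assumptions for illustration only, and real Degoo, pCloud, or ProtonDrive support would require each service's own API or SDK.

```python
# Hypothetical sketch only: the WacOS repository does not publish a provider
# API, so every name here is an assumption used to illustrate how multiple
# backends (Degoo, pCloud, ProtonDrive) could sit behind one interface.
from abc import ABC, abstractmethod
from pathlib import Path


class CloudProvider(ABC):
    """Minimal interface a WacOS Cloud backend might implement."""

    name: str = "unnamed"

    @abstractmethod
    def upload(self, local_path: Path, remote_path: str) -> None:
        """Send a local file to the remote store."""

    @abstractmethod
    def download(self, remote_path: str, local_path: Path) -> None:
        """Fetch a remote file to local disk."""


class PCloudProvider(CloudProvider):
    """Placeholder backend; real pCloud API calls would go here."""

    name = "pCloud"

    def upload(self, local_path: Path, remote_path: str) -> None:
        print(f"[{self.name}] would upload {local_path} -> {remote_path}")

    def download(self, remote_path: str, local_path: Path) -> None:
        print(f"[{self.name}] would download {remote_path} -> {local_path}")


def sync_file(provider: CloudProvider, local_path: Path, remote_path: str) -> None:
    """Upload through whichever backend the user configured."""
    provider.upload(local_path, remote_path)


if __name__ == "__main__":
    sync_file(PCloudProvider(), Path("notes.txt"), "/WacOS/notes.txt")
```

The point of a shared interface like this is that switching or adding providers would not require changing the rest of the tooling, which matters when several services are planned but none is guaranteed long-term.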

Problems with iCloud

Apple iCloud August 2021 controversy

This section needs to be rewritten for neutrality and professionalism.

THIS SECTION IS OUTDATED

Today, I learned of something stupid Apple did: they are now scanning all iCloud media for child abuse content. You are probably asking, how is that bad? Don't you hate child abuse? Yes I do, but there is a massive flaw with this: algorithms aren't that good yet. When I was using OneDrive, its recognition of everyday objects was so poor it was laughable. Applying a filter like this is like using Google Translate to pass a PhD exam in Esperanto; it isn't accurate enough. I think Apple should still try this, but they shouldn't pull a Google/YouTube and leave everything to the machine. There need to be people checking each flagged incident to make sure it isn't something innocent, like a video of a fish that the algorithm mistakes for someone dropkicking a child down the stairs, because the algorithm really can be that bad. Imagine taking a picture of a burrito, uploading it to iCloud, and then having the FBI come to arrest you and take away your children because the algorithm was that bad. Algorithms are easily fooled, and YouTube is a prime example of this.

The problem: leaving it to the machines

The solution: have a real person or a team of people (if a single person can't be trusted) verify each claim of child abuse to make sure the algorithm isn't sending an innocent person to prison

This section was last updated on August 5th 2021 at 11:29 pm

DETADTUO SI NOITCES SIHT

I am keeping this section until I can document it better; the situation is worse than expected. I wasn't aware enough of this, and I know that it is a bad thing, and that Apple is falling off its small privacy platform, which was never very legitimate to begin with, since the majority of Apple software is proprietary. This is yet another example of the "Think of the Children" argument, where emotions are evoked and children are used as a pretext so that Apple can take away user freedom without much pushback. This strategy has unfortunately worked many times ever since the September 11th, 2001 attacks, but luckily, society seems to be figuring it out, as the move was not received well by the privacy community or by people outside it. Facebook even had the audacity to announce that they oppose this; while that is good, they are in absolutely no position to say anything, and they are trying to look like the good guys by taking advantage of a weakened Apple.

This is Apple performing what is called damage control: they know that if they had simply been caught doing this, it would have created a much bigger controversy. Because they announced it this way, it still created a controversy, but unfortunately, people are likely to forget about it and move on.

... But not everyone will forget and move on...

This will live on in their history, and time will not heal this wound. Apple is completely rotten now, and needs to be thrown in the trash.

This section was last updated on August 12th 2021 at 10:06 pm

iCloud 2014 celebrity photo leak

From Wikipedia

On August 31, 2014, a collection of almost 500 private pictures of various celebrities, mostly women, and with many containing nudity, were posted on the imageboard 4chan, and later disseminated by other users on websites and social networks such as Imgur and Reddit. The images were initially believed to have been obtained via a breach of Apple's cloud services suite iCloud, or a security issue in the iCloud API which allowed them to make unlimited attempts at guessing victims' passwords. Apple claimed in a press release that access was gained via spear phishing attacks.

The incident, which media outlets and Internet users referred to under names such as "The Fappening" (a play on "fap", a slang term for masturbation, and The Happening) and "Celebgate", was met with a varied reaction from the media and fellow celebrities. Critics felt the distribution of the images was a major invasion of privacy for their subjects, while some of the allegedly depicted subjects denied their authenticity. The leak also prompted increased concern from analysts surrounding the privacy and security of cloud computing services such as iCloud—with a particular emphasis on their use to store sensitive, private information.

Other iCloud security and privacy problems

Request a controversy to list here