



Thread: YouTube, Facebook, & TikTok Won’t Discuss Bad Takedowns? Get Over It, They’re Busy

  1. #1
    kirill


    Over the years we've published many articles detailing abusive content removal demands and, more generally, the staggering volume of takedown notices received by the likes of Google and YouTube. A common complaint from users of these services is the difficulty of finding a real person to discuss their issues with when things go wrong. That's unlikely to change anytime soon, because content is being taken down like never before.


    Back in August we reported how Google had received requests to remove one billion allegedly-infringing links from its search results. A billion is a big number, especially when it refers to takedown demands received over a period of just nine months.

    A few days before we published that report, Google had just processed its seven billionth removal request, having reached six billion less than a year earlier. At the time of writing, just four months after reaching seven billion, Google has already processed another 572,000,000 takedown demands.

    And that’s only Google search. Content ID claims alone reached 1.5 billion on YouTube in 2021 and that doesn’t account for all the removals carried out by Facebook, Instagram, TikTok, X/Twitter, Snapchat and any other platform that springs to mind.


    The Situation is Bad and Getting Worse By the Day

    Under the Digital Services Act, large online platforms are required to keep the European Commission updated via so-called ‘statements of reasons’ which detail the circumstances behind the removal of every piece of content from their platforms. These reports are added to an EC database which is made available in the form of a continuously updated transparency report.

    For demonstration purposes we extracted all the reasons for removal cited by YouTube in one 24-hour period during the last week and found several related to copyright, including those detailed below.

    ⦿ Your video has been removed from YouTube for a Terms of Service violation because it is a copy of another video that was previously removed from YouTube due to a copyright removal request that we received.

    ⦿ Content that shows viewers how to gain unauthorized free access to audio content, audiovisual content, full video games, software, or streaming services that normally require payment is not allowed.

    ⦿ Due to multiple copyright strikes associated with the videos below, your YouTube channel has now been terminated. Copyright owners can choose to issue legal complaints that require YouTube to take down videos that contain their content. When you have 3 or more copyright strikes, your channel can be terminated.

    Other reasons for content deletion unrelated to copyright, and in some cases seemingly more complicated to determine via automated means, were in abundant supply. Those listed below represent just a small sample.



    Social media platform Facebook also reports huge numbers of takedowns to the EC. On the handful of days we extracted the company’s reports, data protection and privacy violations were very common, along with ‘scams and fraud’, ‘illegal or harmful speech’, and ‘pornography or sexualized content’, the latter often labeled ‘synthetic media’.


    Reasons For Removal Vary But All Platforms Are Staggeringly Busy

    Depending on the nature of the platform, the reasons for removing content can vary considerably. On the days we took samples, which may not necessarily be representative in a broader analysis, Amazon removed huge numbers of listings for copyright and trademark infringement, violations of electrical/packaging standards, fakes and scams, and general advertising policy violations. Overall, few if any violations were of a personal nature, however.

    TikTok, on the other hand, appears to spend a worrying amount of time removing content categorized as ‘Violent Behaviors and Criminal Activities’, ‘Harassment and Bullying’, ‘Hate Speech and Hateful Behaviors’, ‘Sexually Suggestive Content’, ‘Sexual Exploitation and Gender-Based Violence’, ‘Suicide and Self-Harm’ and well, you get the idea. What motivates users to act in this manner is best left to mental health specialists, but it seems that without TikTok’s constant moderation, the platform might be completely uninhabitable.

    That brings us back to the almost inevitable conclusion that at some point, few if any major platforms will have the resources to deal with abusive takedowns on an individual, human-powered basis, on any meaningful scale. The EU’s DSA ‘Statements of Reasons’ database shows why individual attention is likely to become even more scarce as major platforms deal with a seemingly endless tsunami of takedowns based on a growing list of alleged violations.



    When combined, YouTube, Facebook, TikTok, Google Play, Apple’s App Store and Amazon reported 25,847,600 takedowns for just one week, each with a statement explaining why the content was removed. But that’s only the beginning.

    To provide the full picture we would need to add AliExpress, Booking.com, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, X/Twitter, and Zalando to the above.

    The numbers are big: 11,679,101 statements of reasons were added to the system on December 5 and another 15,519,304 on December 6. During the last week the smallest number of statements filed in a single day was 9,828,619. The image below shows the overall position as of this morning.



    Those curious to see for themselves can grab daily .csv files weighing in at 5GB/6GB each and containing nothing but text.
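A file that size can't sensibly be opened in a spreadsheet, but it can be streamed row by row. As a rough sketch, assuming the CSV exposes columns along the lines of `platform_name` and `decision_ground` (hypothetical names; check the actual file's header before relying on them), tallying removal grounds for one platform might look like this:

```python
import csv
from collections import Counter

def tally_reasons(path, platform=None, limit=None):
    """Stream a large statements-of-reasons CSV and count the cited
    removal ground per row, without loading the file into memory.

    Column names 'platform_name' and 'decision_ground' are assumptions,
    not confirmed against the real EC export format."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        for i, row in enumerate(reader):
            if limit is not None and i >= limit:
                break  # allow sampling just the first N rows of a 5GB file
            if platform and row.get("platform_name") != platform:
                continue
            counts[row.get("decision_ground", "UNKNOWN")] += 1
    return counts
```

Because `csv.DictReader` iterates lazily over the open file handle, memory use stays flat regardless of file size; the `limit` parameter makes it practical to peek at a sample before committing to a full pass.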

    After attempting to review just one of these files, it's clear why YouTube struggles with disputes that can't be handled by automation. AI may at some point provide something close to acceptable, but until our artificial overlords can produce a credible fair use assessment or recognize when anti-piracy outfits are using crude word-based filters, copyright frustrations will continue as normal.

  2. #2
    VitaminD3
    This data is incomplete without the total number of videos added over the same period; I'd like to see whether the percentage is going up or holding steady over the years. My 2c.

    Rant on:
    It's really bad that nothing can be done against these companies at a government or international level either! All the hard work of content creators gets paid less and less per view every year, so they're driven to work more and more, at a massive toll on their mental and physical health, or to run more ads, which drives subscribers away. I hope fairer platforms like Nebula bite into Google's profits and force its management to think things through, but that's up to us: support content creators on alternative platforms.
    These "takedowns" make life really hard for independent journalists as well: you can't say anything bad (but true) about a bigger entity, such as publishing corruption or business fraud investigations, without risking your channel's future.
    Hopefully the solution will come from the EU (it doesn't look like the US cares about a fair world these days): start with large fines until Google & co. provide standardized, EU-regulated, detailed information on what triggered each action (links, timestamps, whether automatic or manual) instead of generic reasons. The next step would be an easier way to appeal automated takedowns, and then getting the community involved in "policing" the platforms (as volunteers, hey, it works for Reddit, right? or as paid help). Making this data publicly available is a good step forward.

