Do any jurisdictions seriously consider reclassifying social media websites as publishers?



























The US Communications Decency Act of 1996 was the first attempt to regulate pornography on the internet.

Moreover, Section 230 of the act has been used by internet companies to avoid being characterised as publishers, and thus to avoid the responsibilities and obligations of publishers.

Still, it's very difficult to look at a company like Facebook and not see it as a massive publishing company. That such companies are called 'new media' or 'social media' companies suggests this is how they are seen de facto, if not de jure.

Is it perhaps time for legislators to take another look at this legal provision, given that questions have now been raised about the effectiveness of social media companies' self-regulation of content?

Q. Are any jurisdictions seriously considering relaxing this kind of provision?






























  • @Obie2.0 I think the difference is that if someone libels you in a book, you can sue both the author and the publisher, so publishers don't publish authors who do that. If someone posts libel on Facebook, you can't sue Facebook for letting them do that. – Jeff Lambert, Apr 18 at 12:55

  • @Obie2.0: Less dramatically, if Facebook is considered a publisher, they can also be sued for any instance of copyright infringement, which may include anything from linked news articles to pictures of memes. – hszmv, Apr 18 at 13:32

  • Facebook in Germany supports the Impressum, the mandatory data for anyone publishing; I'm not sure whether this means they're counted as a publisher. Facebook and Twitter certainly follow German law on not displaying Nazi symbols. – pjc50, Apr 18 at 14:31

  • This isn't an answer to the question, but regarding some assumptions made in the question: there is a very large difference between a platform like Facebook or Twitter, or, say, a comments section on a blog or news article, and a publisher such as the actual blog itself or the actual news articles themselves. Namely, the difference is user-generated content vs. editorial content curated by the owner of the site/platform. Social media of any sort (as well as sites like Stack Exchange) would be almost completely impossible if they had editorial liability for user-generated content. – reirab, Apr 18 at 20:56

  • A site would not be able to hire enough editors to review all user-generated content for accuracy or copyright infringement. – reirab, Apr 18 at 20:57
















Tags: regulation, social-media, fake-news














asked Apr 18 at 12:27 by Mozibur Ullah (edited Apr 18 at 13:12)










2 Answers
































The Attorneys General of 47 states sent a letter to Congress in July 2013 recommending that the civil and criminal immunity in Section 230 be removed. So there is broad support for doing something to address internet companies' responsibilities, but it is hard to find agreement on what should be done.

The ACLU came out in opposition to weakening the law's protections, and went so far as to submit a rebuttal letter to Congress one week later:

If their proposal were to pass, it would mean that every website on the Internet could be subject to legal liability for violations of an unfathomable number of state laws.

Matt Zimmerman of the Electronic Frontier Foundation had this to say about the AGs' proposal:

Their approach is wrong, and dangerously so, but even if the AGs disagree and want a debate about how state criminal laws fit into the regulation of the Internet, they owe the public a more honest discussion.

[...]

Instead, the AGs are really proposing to do something far more revolutionary in scope: make service providers criminally responsible for what their users do, even if they don't intend for any illegal activity to take place on or through their services (or even have specific knowledge about it).

It's not clear whether the ACLU or the EFF are against modifying the law in any way, or whether they were just against the AGs' specific recommended course of action.

Congress did vote in March 2018 to narrowly limit the scope of Section 230, but the limitation was mostly concerned with service providers who 'knowingly facilitate' sex trafficking, with the apparent target of the restriction being the online service Backpage.com. The bill was signed into law by President Trump on April 11th, 2018. From the debate prior to the vote:

Sen. Blumenthal: This bill would clarify section 230 of the Communications Decency Act, which was never intended to give websites a free pass to aid and abet sex trafficking. It was never intended to immunize completely those websites so they could knowingly facilitate sex trafficking. Those words are in the bill: "knowingly facilitate."

The purpose of our measure, very simply, is to give survivors their day in court. Right now, the courtroom doors are barred to them, as a recent court of appeals opinion remarked, outrageously so. It would also open avenues of prosecution to law enforcement where they are currently roadblocked.

It should be noted that Section 230 is not ironclad. Some internet publishers have been found liable in court for "encourag[ing] the development of what is offensive."





















































    Yes, the House of Lords in the UK debated this in January 2018. From parliament.uk:

    Lords debates online news and content publishers

    Members of the Lords, including a former government digital champion and the shadow spokesperson for digital, culture, media and sport, debated the role played by social media and online platforms as news and content publishers, in the House of Lords on Thursday 11 January.

    This was a balloted debate. They normally take place on a Thursday in the chamber. During debates, members are able to put their experience to good use, discussing current issues and drawing the government's attention to concerns.

    The debate was proposed by Baroness Kidron (Crossbench), member, Royal Foundation Taskforce on the Prevention of Cyberbullying.

    In May 2018, the government stated the following in reply to a green paper (on page 14 of the linked document):

    Whilst the case for change is clear, we also recognise that applying publisher standards of liability to all online platforms could risk real damage to the digital economy, which would be to the detriment of the public who benefit from them. That is why we are working with our European and international partners, as well as the businesses themselves, to understand how we can make the existing frameworks and definitions work better, and what a liability regime of the future should look like. This will play an important role in helping to protect users from illegal content online and will supplement our Strategy


































      2 Answers
      2






      active

      oldest

      votes








      2 Answers
      2






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes









      10














      Attorneys General of 47 states sent a letter to Congress in July of 2013 recommending that the civil and criminal immunity in Section 230 be removed. So there is broad support for doing something to address internet companies' responsibilities, but it is hard to find agreement on what should be done.



      The ACLU came out in opposition to weakening the law's protections, and went so far as to submit a rebuttal letter to Congress one week later.




      If their proposal were to pass, it would mean that every website on the Internet could be subject to legal liability for violations of an unfathomable number of state laws.




      Matt Zimmerman from the Electronic Frontier Foundation had this to say about The AGs proposal:




      Their approach is wrong, and dangerously so, but even if the AGs disagree and want a debate about how state criminal laws fit into the regulation of the Internet, they owe the public a more honest discussion.

      [...]

      Instead, the AGs are really proposing to do something far more revolutionary in scope: make service providers criminally responsible for what their users do, even if they don't intend for any illegal activity to take place on or through their services (or even have specific knowledge about it).




      It's not clear if the ACLU or EFF are against modifying the law in any way, or if they were just against the AGs specific recommended course of action.



      Congress did vote in March of 2018 to narrowly limit the scope of Section 230, but the limitation was mostly concerned with service providers who 'knowingly facilitate' sex trafficking with the apparent target of the restriction being the online service Backpage.com. The bill was signed into law by President Trump on April 11th, 2018. From the debate prior to the vote:




      Sen. Blumenthal: This bill would clarify section 230 of the Communications Decency Act, which was never intended to give websites a free pass to aid and abet sex trafficking. It was never intended to immunize completely those websites so they could knowingly facilitate sex trafficking. Those words are in the bill--``knowingly facilitate.''


      The purpose of our measure, very simply, is to give survivors their day in court. Right now, the courtroom doors are barred to them, as a recent court of appeals opinion remarked, outrageously so. It would also open avenues of prosecution to law enforcement where they are currently roadblocked.




      It should be noted that Section 230 is not ironclad. Some internet publishers have been found liable in court for "encourag[ing] the development of what is offensive."






      share|improve this answer






























        10














        Attorneys General of 47 states sent a letter to Congress in July of 2013 recommending that the civil and criminal immunity in Section 230 be removed. So there is broad support for doing something to address internet companies' responsibilities, but it is hard to find agreement on what should be done.



        The ACLU came out in opposition to weakening the law's protections, and went so far as to submit a rebuttal letter to Congress one week later.




        If their proposal were to pass, it would mean that every website on the Internet could be subject to legal liability for violations of an unfathomable number of state laws.




        Matt Zimmerman from the Electronic Frontier Foundation had this to say about The AGs proposal:




        Their approach is wrong, and dangerously so, but even if the AGs disagree and want a debate about how state criminal laws fit into the regulation of the Internet, they owe the public a more honest discussion.

        [...]

        Instead, the AGs are really proposing to do something far more revolutionary in scope: make service providers criminally responsible for what their users do, even if they don't intend for any illegal activity to take place on or through their services (or even have specific knowledge about it).




        It's not clear if the ACLU or EFF are against modifying the law in any way, or if they were just against the AGs specific recommended course of action.



        Congress did vote in March of 2018 to narrowly limit the scope of Section 230, but the limitation was mostly concerned with service providers who 'knowingly facilitate' sex trafficking with the apparent target of the restriction being the online service Backpage.com. The bill was signed into law by President Trump on April 11th, 2018. From the debate prior to the vote:




        Sen. Blumenthal: This bill would clarify section 230 of the Communications Decency Act, which was never intended to give websites a free pass to aid and abet sex trafficking. It was never intended to immunize completely those websites so they could knowingly facilitate sex trafficking. Those words are in the bill--``knowingly facilitate.''


        The purpose of our measure, very simply, is to give survivors their day in court. Right now, the courtroom doors are barred to them, as a recent court of appeals opinion remarked, outrageously so. It would also open avenues of prosecution to law enforcement where they are currently roadblocked.




        It should be noted that Section 230 is not ironclad. Some internet publishers have been found liable in court for "encourag[ing] the development of what is offensive."






        share|improve this answer




























          10












          10








          10







          Attorneys General of 47 states sent a letter to Congress in July of 2013 recommending that the civil and criminal immunity in Section 230 be removed. So there is broad support for doing something to address internet companies' responsibilities, but it is hard to find agreement on what should be done.



          The ACLU came out in opposition to weakening the law's protections, and went so far as to submit a rebuttal letter to Congress one week later.




          If their proposal were to pass, it would mean that every website on the Internet could be subject to legal liability for violations of an unfathomable number of state laws.




          Matt Zimmerman from the Electronic Frontier Foundation had this to say about The AGs proposal:




          Their approach is wrong, and dangerously so, but even if the AGs disagree and want a debate about how state criminal laws fit into the regulation of the Internet, they owe the public a more honest discussion.

          [...]

          Instead, the AGs are really proposing to do something far more revolutionary in scope: make service providers criminally responsible for what their users do, even if they don't intend for any illegal activity to take place on or through their services (or even have specific knowledge about it).




          It's not clear if the ACLU or EFF are against modifying the law in any way, or if they were just against the AGs specific recommended course of action.



          Congress did vote in March of 2018 to narrowly limit the scope of Section 230, but the limitation was mostly concerned with service providers who 'knowingly facilitate' sex trafficking with the apparent target of the restriction being the online service Backpage.com. The bill was signed into law by President Trump on April 11th, 2018. From the debate prior to the vote:




          Sen. Blumenthal: This bill would clarify section 230 of the Communications Decency Act, which was never intended to give websites a free pass to aid and abet sex trafficking. It was never intended to immunize completely those websites so they could knowingly facilitate sex trafficking. Those words are in the bill--``knowingly facilitate.''


          The purpose of our measure, very simply, is to give survivors their day in court. Right now, the courtroom doors are barred to them, as a recent court of appeals opinion remarked, outrageously so. It would also open avenues of prosecution to law enforcement where they are currently roadblocked.




          It should be noted that Section 230 is not ironclad. Some internet publishers have been found liable in court for "encourag[ing] the development of what is offensive."






          share|improve this answer















          Attorneys General of 47 states sent a letter to Congress in July of 2013 recommending that the civil and criminal immunity in Section 230 be removed. So there is broad support for doing something to address internet companies' responsibilities, but it is hard to find agreement on what should be done.



          The ACLU came out in opposition to weakening the law's protections, and went so far as to submit a rebuttal letter to Congress one week later.




          If their proposal were to pass, it would mean that every website on the Internet could be subject to legal liability for violations of an unfathomable number of state laws.




          Matt Zimmerman from the Electronic Frontier Foundation had this to say about The AGs proposal:




          Their approach is wrong, and dangerously so, but even if the AGs disagree and want a debate about how state criminal laws fit into the regulation of the Internet, they owe the public a more honest discussion.

          [...]

          Instead, the AGs are really proposing to do something far more revolutionary in scope: make service providers criminally responsible for what their users do, even if they don't intend for any illegal activity to take place on or through their services (or even have specific knowledge about it).




          It's not clear if the ACLU or EFF are against modifying the law in any way, or if they were just against the AGs specific recommended course of action.



          Congress did vote in March of 2018 to narrowly limit the scope of Section 230, but the limitation was mostly concerned with service providers who 'knowingly facilitate' sex trafficking with the apparent target of the restriction being the online service Backpage.com. The bill was signed into law by President Trump on April 11th, 2018. From the debate prior to the vote:




          Sen. Blumenthal: This bill would clarify section 230 of the Communications Decency Act, which was never intended to give websites a free pass to aid and abet sex trafficking. It was never intended to immunize completely those websites so they could knowingly facilitate sex trafficking. Those words are in the bill--``knowingly facilitate.''


          The purpose of our measure, very simply, is to give survivors their day in court. Right now, the courtroom doors are barred to them, as a recent court of appeals opinion remarked, outrageously so. It would also open avenues of prosecution to law enforcement where they are currently roadblocked.




          It should be noted that Section 230 is not ironclad. Some internet publishers have been found liable in court for "encourag[ing] the development of what is offensive."







          share|improve this answer














          share|improve this answer



          share|improve this answer








          edited Apr 18 at 13:27

























          answered Apr 18 at 13:18









          Jeff LambertJeff Lambert

          10.6k52951




          10.6k52951























              7














              Yes, the Lords in the UK did debate this in January of 2018. From parliament.uk:




              Lords debates online news and content publishers



              Members of the Lords, including a former government digital champion and the shadow spokesperson for digital, culture, media and sport, debated the role played by social media and online platforms as news and content publishers, in the House of Lords on Thursday 11 January.



              This was a balloted debate. They normally take place on a Thursday in the chamber. During debates, members are able to put their experience to good use, discussing current issues and drawing the government's attention to concerns.



              The debate was proposed by Baroness Kidron (Crossbench), member, Royal Foundation Taskforce on the Prevention of Cyberbullying.




              In May of 2018, the government stated the following in reply to a green paper (on page 14 of the linked document):




              Whilst the case for change is clear, we also recognise that applying publisher standards of
              liability to all online platforms could risk real damage to the digital economy, which would be
              to the detriment of the public who benefit from them. That is why we are working with our
              European and international partners, as well as the businesses themselves, to understand
              how we can make the existing frameworks and definitions work better, and what a liability
              regime of the future should look like. This will play an important role in helping to protect
              users from illegal content online and will supplement our Strategy







                  edited Apr 18 at 13:23

























                  answered Apr 18 at 13:12









                  JJJ
