

UK AI watchdog reckons social media firms should be more transparent

The Centre for Data Ethics and Innovation says there is strong public support for greater regulation of online platforms, but then it would.

It knows this because it got Ipsos MORI to survey a couple of thousand Brits in the middle of last year and ask them how much they trust a bunch of digital organisations to personalise what they deliver and to target advertising in a responsible way. You can see the responses in the table below, which err towards distrust but not by a massive margin. The 'don't knows' probably provide an indication of market penetration.

How much trust, if any, do you have in each of the following organisations to personalise the content users see and to target them with advertising in a responsible way?

                        A great deal  A fair amount  Not very much  No trust at all  Don't know
Facebook                     7%           24%            30%             32%             8%
YouTube                     10%           38%            26%             16%            10%
Instagram                    6%           22%            24%             24%            23%
TikTok                       4%            8%            15%             28%            45%
Twitter                      6%           22%            25%             25%            23%
Snapchat                     5%           15%            22%             26%            32%
Amazon                      13%           43%            24%             13%             7%
LinkedIn                     7%           25%            18%             20%            30%
BBC iPlayer                 16%           45%            17%             10%            11%
Google search or Maps       13%           44%            23%             13%             7%

It seems that UK punters haven't generally got a problem with online profiling and consequent ad targeting, but are concerned about the lack of accountability and consumer protection given the significant influence this power confers. 61% of people favoured greater regulatory oversight of online targeting, which again is hardly a landslide and not the most compelling datapoint on which to base public policy.

"Most people do not want targeting stopped, but they do want to know that it is being done safely and responsibly and they want more control," said Roger Taylor, Chair of the CDEI. "Tech platforms' ability to decide what information people see puts them in a position of real power. To build public trust over the long-term it is vital for the Government to ensure that the new online harms regulator looks at how platforms recommend content, establishing robust processes to protect vulnerable people."

Ah, the rallying cry for authoritarians everywhere: ‘think of the vulnerable!’ Among those, it seems, are teenagers, who are notorious for their digital naivety. “We completely agree that there needs to be greater accountability, transparency and control in the online world,” said Dr Bernadka Dubicka, Chair of the Child and Adolescent Faculty at the Royal College of Psychiatrists. “It is fantastic to see the Centre for Data Ethics and Innovation join our call for the regulator to be able to compel social media companies to give independent researchers secure access to their data.”

The CDEI was created last year to keep an eye on AI and technology in general, with a stated aim of investigating potential bias in algorithmic decision-making. This report is the first thing it has produced in the intervening year, and it amounts to a generic bureaucratic recommendation it could have made on day one. Still, Rome wasn't built in a day, and it did at least pad that recommendation out into a 120-page report.


2 comments

  1. Bernard Parker 05/02/2020 @ 1:03 pm

     Not a good example of unbiased reporting

     • Scott Bicheno 05/02/2020 @ 1:08 pm

       Not a good example of useful commenting.
