Hello and welcome to Screenshot, your weekly tech update from national technology reporter Ange Lavoipierre, featuring the best, worst and strangest in tech and online news. Read to the end for an A+ Subreddit recommendation you didn’t know you needed.
As the next federal election draws closer and the deepfakes keep getting better, you should know there are no actual laws against using deepfakes in a political campaign.
The Australian Electoral Commissioner, Tom Rogers, has confirmed as much at parliament’s AI inquiry this week, saying: “It’s lawful … and so whether or not that law changes, it’s a matter for parliament, not for me.”
To spell it out, if Labor wanted to release a lifelike video of Peter Dutton explaining why he’s a vegan, or if the Coalition posted an equally convincing video of Anthony Albanese announcing a ban on dog ownership, there would be no obvious law impeding them.
I’m not saying either party would do that. I’m also not speculating on what a candidate such as Clive Palmer might consider above or below the belt during a political campaign.
“It’s one thing to see an unethical player lying about their political opponent. It’s quite another thing to … generate the lies being told out of the mouth of your political opponent,” Greens senator David Shoebridge said.
“The government has, I think, dropped the ball on the use of generative AI,” added independent senator David Pocock.
The commissioner was enthusiastic about the idea of new laws, but not about prohibiting AI in campaigns altogether.
“A blanket ban on AI-related material [in elections] would be … very impractical, but we would like to see some changes … particularly some legislative changes,” Mr Rogers said.
In particular, he said mandatory watermarking could help, but argued it should not fall to the AEC to enforce.
“We will do whatever parliament asks us to do … but there is a real risk of ruining the AEC’s neutrality if that is put upon us,” he told the inquiry.
“As I stated in other forums, the AEC does not possess the legislative tools or the internal technical capability to deter, detect or adequately deal with false AI-generated content concerning the electoral process.”
For what it’s worth, South Korea amended its election law in 2023 to ban election-related deepfake videos, photos, and audio in the 90 days before the vote, and it seemed to help.
The Australian government has been discussing new deepfake laws, but so far only in the context of the non-consensual, explicit variety.
It’s official: the age verification trial will be about more than just porn. It will also take in social media and gaming, meaning the majority of the Australian population could be asked to prove their age online.
There’s no guarantee all of that will end up in a scheme, but including extra use cases in the trial is the first step down that path, and perhaps a statement of intent.
Given the extended scope, we’ve gone deep on how age verification tech is currently working and what it might be like to use.
That said, the plan is still light on detail. This week a Screenshot reader got in touch to ask what the go was for gaming, and honestly, we have no idea. The government is promising to explain more soon and we promise to tell you when it does.
As if it needed to be said again, being a teenager in 2024 is sometimes a very specific hell that’s obscure to anyone over the age of 30.
Take for example being doxxed at the age of 13, which is just such a far cry from my nightly adolescent turf wars at the same age over who would get to use the sole family PC.
Attorney-General Mark Dreyfus says doxxing laws are on the way, but we’re not expecting to see a draft until August.
While AI being “in” a computer might sound as obvious as blue being “in” the sky, this is actually one of those things that is a Big Deal™.
AI models are normally either downloaded or used online, but Microsoft has just announced an “AI computer”, meaning the technology is in-built.
It’s the company’s latest play in the overheated race to see which tech giant can get the most AI into the most places, fastest.
What does it mean? Hard to say! In case you haven’t worked it out yet, this is all one big live experiment, and we’re the rats.
Perhaps there’s some comfort in knowing we’ll all find out together.
Speaking of the big experiment, life has been imitating art for some time now when it comes to AI, but this week the resemblance got a little too close for ChatGPT's maker, OpenAI.
OpenAI chose to pull “Sky” from the voices of its fancy new talking language model, due to its uncanny resemblance to Scarlett Johansson’s performance in the movie Her.
Johansson has since revealed to NPR that OpenAI tried and failed to convince her to be the voice of Sky, and says she’s “shocked, angered and in disbelief” having now heard the final product.
OpenAI says a different actor voiced Sky, but they’re “pausing” it for now anyway.
The company says voice mode (which can seemingly handle sarcasm, singing and interruptions) will be made available to premium users in the coming weeks.
Finally, take some joy in knowing that owl-lovers beat one of the biggest sporting franchises in the world to the punch in securing r/Superbowl on Reddit for their own purposes.
Pronounced “superb owl”, this forum is every bit as finicky and pure-hearted as you might dare to hope.
Please note: posts from owl cafes are strictly prohibited.
Recommendations and tips are always welcome. You can reach me securely via Proton Mail.