No-one reported the video of the Christchurch terror attack while it was being streamed live, Facebook has said.

It was 29 minutes after the video had started – and 12 minutes after it had ended – before the first user flagged the footage, the social media giant said.

The company earlier revealed that it had removed 1.5 million videos of the attack worldwide in the 24 hours after the shootings, 1.2 million of which were blocked at upload.

People mourn at a makeshift memorial site near the Al Noor mosque in Christchurch (Vincent Thian/AP)

Facebook and other social media firms have come under fire over the rapid spread of the footage across the networks and around the world.

In a blog post on Tuesday, Chris Sonderby, vice president and deputy general counsel at Facebook, said the video was viewed fewer than 200 times during its live broadcast.

“No users reported the video during the live broadcast,” he added.

“Including the views during the live broadcast, the video was viewed about 4,000 times in total before being removed from Facebook.

“The first user report on the original video came in 29 minutes after the video started, and 12 minutes after the live broadcast ended.

Muslims offer prayers near the main road to the Al Noor mosque in Christchurch (Vincent Thian/AP)

“Before we were alerted to the video, a user on 8chan posted a link to a copy of the video on a file-sharing site.”

Mr Sonderby said Facebook was “working around the clock” to prevent the video from appearing on its site.

Meanwhile, the Global Internet Forum to Counter Terrorism, formed by tech giants Facebook, Microsoft, Twitter and YouTube in 2017 to tackle the spread of terrorism online, said more than 800 distinct versions of the video had been added to a shared database.

The group said the “digital fingerprints” of visually distinct versions were included, in a bid to uncover and remove edited copies designed to get around existing detection technology.
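In broad strokes, such fingerprinting reduces each video frame to a compact perceptual hash and compares hashes by how many bits differ, so lightly edited or re-encoded copies still land close to a known original. The sketch below illustrates the general idea with a simple “average hash”; the file names, threshold and database are hypothetical, and this is not a description of the forum’s actual, non-public matching technology.

```python
# A minimal sketch of perceptual ("average") hashing, the broad family of
# techniques behind shared "digital fingerprint" databases. Illustration
# only: file names and the threshold are hypothetical assumptions.
from PIL import Image


def average_hash(frame_path: str, hash_size: int = 8) -> int:
    """Shrink a video frame to a tiny grayscale thumbnail and encode each
    pixel as one bit: 1 if brighter than the thumbnail's mean, else 0."""
    img = Image.open(frame_path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (pixel > mean)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")


def matches_database(frame_path: str, database: set, threshold: int = 10) -> bool:
    """A small Hamming distance means the frame is visually close to a
    known fingerprint, even after re-encoding, cropping or light edits."""
    fingerprint = average_hash(frame_path)
    return any(hamming_distance(fingerprint, known) <= threshold
               for known in database)


# Hypothetical usage: seed the database from a frame of the known video,
# then test a frame taken from a newly uploaded copy.
# database = {average_hash("known_video_frame.png")}
# print(matches_database("uploaded_frame.png", database))
```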

“This incident highlights the importance of industry co-operation regarding the range of terrorists and violent extremists operating online,” it said.

Fifty pairs of white shoes lined up as a memorial to the victims of the Christchurch massacre (Mark Baker/AP)

New Zealand Prime Minister Jacinda Ardern has called on social media companies to take responsibility for ensuring that such content cannot be distributed or viewed on their platforms, saying they are “the publisher, not just the postman”.

She told the country’s parliament: “There is no question that ideas and language of division and hate have existed for decades, but their form of distribution, the tools of organisation, they are new.

“We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published.

“They are the publisher, not just the postman. There cannot be a case of all profit, no responsibility.”

A police official stands guard in front of the Al Noor mosque in Christchurch (Vincent Thian/AP)

In the UK, Home Secretary Sajid Javid told social media companies “enough is enough” in the wake of last Friday’s shootings.

Reacting to a tweet from YouTube claiming that the video-sharing service was working to remove the footage, he said: “You really need to do more @YouTube @Google @facebook @Twitter to stop violent extremism being promoted on your platforms. Take some ownership. Enough is enough.”

Damian Collins, Tory chairman of the Digital, Culture, Media and Sport Select Committee, called for a review into how the videos were shared and “why more effective action wasn’t taken to remove them”.

And Downing Street said social media companies needed to act “more quickly” to remove terrorist content.