Recent Photo of HackerNoon Founder & CEO David Smooke.
Tim Keary from Techopedia: Confidence in journalism is at an all-time low, and artificial intelligence threatens to destabilize it further. News publishers of all sizes are currently being forced to decide what role AI plays in the newsroom.
Will they accept AI-generated content and headlines? How much AI use needs to be disclosed to readers? And will AI-generated news take traffic away from human-written journalism?
Techopedia reached out to David Smooke, founder and CEO of HackerNoon, a technology publisher with over 45,000 contributing writers and 4 million monthly readers, to find out how his organization is experimenting with AI and to get his thoughts on how AI should be approached by journalists and news publishers.
The Q&A provides a brief look at how HackerNoon is experimenting with AI in its operations, what the acceptable limits of AI's role in the newsroom should be, Smooke's thoughts on The New York Times vs. OpenAI lawsuit, and the future of human-written journalism.
Smooke, from Colorado, founded HackerNoon in 2013, back when AI was still effectively confined to the dreams of Hollywood.
Comments and formatting have been edited slightly for brevity.
Tim Keary: As a journalist and the CEO of a news organization, what role do you think AI should play in the newsroom?

David Smooke: I'm more of a writer and a product manager than a journalist. HackerNoon publishes all types of technology blog posts: op-eds, tutorials, interviews, columns, research papers, and some journalism. We're building a community-driven content management system, and there are many places where AI can assist writers, readers, and editors, such as brainstorming new ideas, fixing grammar, or finding your next relevant story.
Within our text editor, we have a custom ChatGPT layer for rewrites and a handful of image-generation models, and we leverage AI to generate summaries at the native character count of each distribution channel. We use AI to make stories more accessible by creating more versions of the story; for example, we use Google AI to translate stories into foreign languages and to generate audio versions of the blog post.
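For illustration, here is a minimal sketch of the per-channel summary idea, assuming the OpenAI Chat Completions API; the channel names, character limits, and model name are hypothetical, not HackerNoon's actual configuration.

```python
# Hypothetical sketch: generate summaries constrained to each distribution
# channel's character limit. Channels and limits below are illustrative.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

CHANNEL_LIMITS = {
    "twitter": 280,            # illustrative limits, not real product config
    "newsletter": 400,
    "push_notification": 110,
}

def summarize_for_channel(story_text: str, channel: str) -> str:
    limit = CHANNEL_LIMITS[channel]
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system",
             "content": f"Summarize the story in at most {limit} characters."},
            {"role": "user", "content": story_text},
        ],
    )
    summary = response.choices[0].message.content.strip()
    # Models don't reliably respect character counts, so enforce the cap here.
    return summary[:limit]

# Example: summaries = {c: summarize_for_channel(draft, c) for c in CHANNEL_LIMITS}
```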
As a consumer of news, when it comes to the newsroom specifically, I would like journalists to research their stories with whatever the most advanced and relevant search technology or specific methodology that story calls for, but to never fully trust the AI, and to always, always verify.
More specifically, what level of use is acceptable in your view, and what level of transparency is required?

It's not acceptable for content to be presented as human-made when it was made by AI. Platforms should do what they can to indicate where and how AI contributed to the experience. For example, we use emoji credibility indicators to show the reader whether AI assisted in the writing of a story. People on the internet should be able to trust that the author is who the site says the author is.
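For readers unfamiliar with the concept, a minimal sketch of how an "emoji credibility indicator" might be attached to a story record; the levels and emoji below are hypothetical, not HackerNoon's actual taxonomy.

```python
# Illustrative sketch of an AI-disclosure badge stored with each story.
from dataclasses import dataclass
from enum import Enum

class AIAssistance(Enum):
    NONE = "✍️ Written by a human"
    ASSISTED = "🤖✍️ Written with AI assistance"
    GENERATED = "🤖 Generated by AI"

@dataclass
class Story:
    title: str
    body: str
    ai_assistance: AIAssistance

    def byline_badge(self) -> str:
        # Rendered next to the author name so the disclosure travels with the story.
        return self.ai_assistance.value

story = Story("Example headline", "Body text…", AIAssistance.ASSISTED)
print(story.byline_badge())  # 🤖✍️ Written with AI assistance
```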
Do you see AI-generated news sites as a threat to the future of human-written journalism?

There are side effects from the mass production and mass consumption of AI-generated content. Deepfakes generate billions of views across social media. Platforms are getting better at detecting and labeling them, but they are easier than ever to make.
When TUAW recently relaunched with newly AI-generated content and attributed it to real humans who used to write there, it did not go over well. Writers don't want that misattribution, and for blog posts, readers trust the content more if they trust the human on the other side of the screen.
Many financial websites and tools have been using natural language processing and automation to dish out headlines in seconds, because that information is important for investors. It's a matter of speed and convenience versus slower human input, but it has been going on for far longer than the generative AI boom we're currently seeing.
I.e., is there a risk that AI-generated content will take clicks, attention, and money away from content that journalists have put time and effort into developing, even if it can't replicate on-the-ground reporting?

Yes, there is a risk of more attention moving from the publisher to the search experience. If a Google-generated AI search result solves the problem today that a page on someone's site would have solved yesterday, that is a lost visitor. On the plus side for publishers, super-powerful AI functions are a single API call away, meaning the publisher's own discovery and search experience may also be able to retain quality traffic longer.
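On the "single API call away" point, here is a minimal sketch of how a publisher might keep readers on-site with AI-assisted discovery, assuming the OpenAI embeddings API; the model name and in-memory similarity search are illustrative, not a description of HackerNoon's stack.

```python
# Illustrative sketch: embed story titles once, then recommend related reads
# by cosine similarity so readers stay on the publisher's site.
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(texts: list[str]) -> np.ndarray:
    response = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in response.data])

def related_stories(query_title: str, catalog_titles: list[str], top_k: int = 3) -> list[str]:
    vectors = embed([query_title] + catalog_titles)
    query, corpus = vectors[0], vectors[1:]
    # Cosine similarity between the current story and every other story.
    scores = corpus @ query / (np.linalg.norm(corpus, axis=1) * np.linalg.norm(query))
    ranked = np.argsort(scores)[::-1][:top_k]
    return [catalog_titles[i] for i in ranked]

# Example: related_stories("What is RAG?", ["Intro to LLMs", "Kubernetes 101", "Vector DBs explained"])
```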
What do you make of The New York Times' lawsuit against OpenAI and the wider trend of AI vendors training models on news content generated by human authors and journalists?

In the future, I anticipate the government, and even the private sector, will rein in the wild-west approach of treating anything on the internet as fair game for training data. Will The New York Times vs. OpenAI be the milestone case for how content creators are compensated by AI companies? I have my doubts, but it certainly involves two big names. It emerged after a content licensing agreement failed. Reading between the lines, if OpenAI had been able to compensate The New York Times more, the case would never have made it to court. The verbatim regurgitation of another's content is plagiarism. The New York Times knows this, OpenAI knows this, and even your grade-school teacher knows this. I expect OpenAI to pay The New York Times more money before this is over, but I don't see this case shaping the future of how internet content is licensed for AI training.
What do you make of AI news tools like Perplexity AI, which offer news summaries with citations? Do you think these can be useful tools, or is there a concern they will take readers away from traditional news sites?

Curating is a value add. Sometimes, especially if given reliable and detailed rules, AI can curate as effectively as some humans. AI is 100% changing how we search and research on the internet. I'm not even certain that Perplexity's search differentiation is the use of AI, as Perplexity has made amazing design choices. I was not surprised to see Meta roll out a very similar scrolling topics homepage during the launch of Meta AI chat, or to see SearchGPT use a similar design for displaying relevant sources. Google Search still dominates the market, and its use of generative AI in search results demonstrates that generative AI will be a part of the future of the internet search market.
How can news publishers use AI to enhance their operations?

We use AI in a number of publishing systems across HackerNoon. Before publication, AI recommends headlines based on the story draft and the past performance of HackerNoon stories. The humans still write better headlines 95% of the time, but it's nice to have the machines generate a few relevant options. We also use AI to better curate stories: when we had to sort 50,000 technology tags into 22 technology categories, it made more sense for an AI to make those 50,000 assignments than for human editors to do it.
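To illustrate the kind of bulk tag categorization Smooke describes, here is a minimal sketch assuming the OpenAI Chat Completions API; the category list, batching note, and model name are illustrative rather than HackerNoon's actual pipeline.

```python
# Hypothetical sketch: ask a model to assign free-form technology tags to a
# fixed set of categories, returning a JSON mapping that editors can review.
import json
from openai import OpenAI

client = OpenAI()

CATEGORIES = ["programming", "ai", "web3", "cybersecurity", "startups"]  # illustrative subset

def categorize_tags(tags: list[str]) -> dict[str, str]:
    prompt = (
        "Assign each tag to exactly one category from this list: "
        f"{CATEGORIES}. Reply with a JSON object mapping tag -> category.\n"
        f"Tags: {tags}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": prompt}],
        response_format={"type": "json_object"},
    )
    return json.loads(response.choices[0].message.content)

# Run in batches so tens of thousands of tags become a few thousand cheap calls
# instead of weeks of human editorial work.
# assignments = categorize_tags(["machine-learning", "llm", "solidity"])
```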
Do you have any comments on the impact of AI on SEO? More specifically, do you think it's going to be harder to rank human-written content against AI-driven content in the future?

AI content is not source material. Sources will always need to be cited and linked to. With the rise of AI-generated search result summaries and the aggregate number of human-to-AI interactions increasing daily, it is becoming expected for search experiences to be supplemented by an AI assistant.
What would you say to writers and journalists who are entering the industry and are concerned about the future of news publishing as a whole?

Don't be afraid of competing with robots. The demand for authentic human stories is as high as it's ever been. As AI floods the internet with billions of pieces of bad content, mediocre content, acceptable-ish content, and even some remarkable content, great storytellers will continue to rise above. Whenever a writer lives or lived, there are barriers to entry to acquiring readers. For most of human history there were no magazines or internet, yet whoever had a quality story to tell found a way to tell it. If you have stories to write, there are more ways than ever before to acquire readers from around the globe.
Are there any other comments you'd like to add?

We will always need human originality… because we're humans, we crave human stories.
Also published as "What is the Role of AI in the Newsroom? We Ask Hackernoon's CEO"