NLTimes – our final starting point for a lifelong journey.

The first step may be the hardest, but remember it takes many more steps to make a journey.

In the past two months, we have often talked about the role of the audience in modern journalism. It should therefore come as no surprise that, during our final fact-checking project, we came to realize that it is up to the audience to decide whether they want to use this medium or not. Our advice is to use it as a starting point, but be aware that you need to look beyond the information presented there in order to get the full picture.

The Medium: NL Times 

Founded in 2013, NL Times (http://www.nltimes.nl/) is an English-language news medium based in the Netherlands. It focuses on Dutch news and writes short but snappy reports about different aspects of the country, including politics, business, sports, health, and weird news. With three international students in our group, it was the logical medium to choose: it is a good starting point for foreigners to learn what is going on in the country they currently live in.

The Articles:

  1. No sick days for over half of Dutch employees by Ingrid Grinstad, published on 2014-11-24.
  2. Dangerous levels of B6 found in multivitamins by Janene van Jaarsveldt, published on 2014-11-24.
  3. Psychological problems cost Dutch business 20 billion euros by Janene van Jaarsveldt, published on 2014-12-02.
  4. Over 19 pct. drop in auto sales by Ingrid Grinstad, published on 2014-12-03.
  5. New police reports filed with DigiD by Janene van Jaarsveldt, published on 2014-12-01.

“No sick days for over half of Dutch employees”

The article contains so many statistical claims that it immediately raised questions in Wernard’s head about their validity. After cross-checking the sources, he found that two out of nine claims were unconfirmed. The editor was willing to change one of them but judged the other a semantic issue. Regarding the choice of the title and the way NL Times reporters write stories from large data reports, he said the following:

“The choice of the title is because in our opinion it is the most interesting piece of information that we could accurately write in a fairly short headline that fits the design of our website.

We choose what to write based on what we find most interesting, what we consider to be most relevant, and sometimes with a consideration for details that may play out long-term.”

“Dangerous levels of B6 found in multivitamins”

Gabriele found that the article provides fairly accurate information about vitamin B6: most of the facts presented are valid, though several could have been more precise. Nonetheless, it is important to stress that small accuracy mistakes count. The opening sentence, which is quite strong and carries a negative connotation, can be considered false and misleading, as it may send a false message about the vitamin industry. The headline and the text match well, but they raise the question whether these dangerous levels were found in the Netherlands, in Europe, or somewhere else. The wording is also vague: the journalist could have indicated which vitamin pills were examined and which of them contained excessive levels of B6. Had the journalist provided another source and point of view, a more solid foundation for a trustworthy article would have been created.

Interestingly enough, the journalist does not mention that the article is a direct translation of a press release by the Consumentenbond. By not linking to the actual results of the research, the article gives the impression that the journalist did not find it important to disclose which multivitamins are “dangerous”. All in all, this article raises a deeper question about the trustworthiness of the source itself. Even if the results mentioned in the article match the results in the research, that does not prove they are correct. Investigating that would go beyond the fact-checking process, as additional research would be needed. The press release also raises the question of what motivated the Consumentenbond to conduct this research and publish it. Investigating the reliability of the Consumentenbond could be a great idea for a future project on the same topic.

“Psychological problems cost Dutch business 20 billion euros”

April found that the article retrieved only partial information from the original OECD report, which resulted in a biased story. The visualization and biased word choices also contributed to this prejudiced impression.

Not mentioning sources puts the journalists’ credibility at risk, especially when a politician is involved in the report. Providing a scenic photo of the event may be the most common way to handle the political angle, but providing a link and referring to the source would have been a more effective way to avoid this risk.

In short, this article gave basic but inaccurate information. Framing is used to convey the journalist’s attitude, which leaves April with the following question:

“How can a journalist who writes a news story based on a specific study demonstrate a standpoint that represents the comparative information in line with the original sources?”

“Over 19 pct. drop in auto sales”

Than found this article to be essentially a summary of different reports from different organizations. The information was collected online and the statements were quoted from elsewhere. Two similar articles from other media had been posted one day earlier.

The articles share a lot in common, especially the data. However, the author of the article being fact-checked misused an important number. There is reasonable suspicion that the journalist did not read the original report from AUMACON, but simply copied the words from the second article mentioned above and misunderstood the meaning of the data. The author analyzes some possible reasons for the phenomenon and claims that her statements are quoted from trustworthy sources, yet the original sources of those quotes cannot be found. There is also not enough support for the journalist’s statements in this article. Overall, the author did no original research, and the article does not offer a valid level of analysis of the issue; the validity of the statements was never proven. The article contains many mistakes, for example in the use of the data. In summary, the article is suspected of misleading readers, both through the big, attention-grabbing number that was wrongly used in the opening and through the unconfirmed claims the author made.

“New police reports filed with DigiD”

This short article left Wineke with more questions than sentences. What was the background of these bold claims about past and present problems and future improvements? With no links or sources given, she had to look for other articles on the same topic and found several government publications that answered her questions. One was a report on current problems, published last summer by the Inspectorate of Security and Justice, along with a promise by the minister to make improvements; the other was an official statement by the national police department. She suspected the latter had been used as the source, and that much information was lost in translation. An email from the editor confirmed and explained this:

“In our opinion, we source the information to the police department. Perhaps this could have been clearer, but we are satisfied with the work. Further, some articles are long, some are short, and that is a decision made at an editorial level based on many factors.

It would be wise to also keep in mind what other news stories are going on that day when considering another story’s word count. (…) you can safely say that December 1, 2014, was an extremely busy news day. We marshalled our resources and placed more emphasis on the stories mentioned above.”

Conclusion

Our analysis has led to some interesting findings, both positive and negative.

On the negative side:

  1. Source reliability was questionable or unclear at times.
    • Although it is an online medium, no direct links were given to any sources, which we found odd.
  2. Unvalidated information was presented as valid.
    • Articles contained statements or information that was incorrect or not accurate enough.
  3. Grammar and spelling errors were found.
    • Apart from making the site look bad, they could mislead readers.

On the positive side:

  1. NLTimes has been quick to accept corrections when errors were pointed out.
    • We have noted before how important it is for media to acknowledge their mistakes these days.
  2. NLTimes delivers news in a clear and fast manner.
    • Given that it aims to be a place where foreigners can quickly check the Dutch news, this is important.
    • However, this is likely the reason why some articles were too short or biased to convey the full story.

Our advice to readers of NLTimes is therefore that it is a good site to visit if you want to know what topics are currently being discussed in the Dutch news media. However, in order to get the full story, it is best to also engage in conversation with the people around you.

The truth, the whole truth, and the occasional fiction

“What event in the news did you read last week that turned out to be incorrect?”

In an ideal world, answering this question would require significant thought, research, and fact checking. In reality, I asked this question over dinner and my sister-in-law provided an immediate answer: “I’d read Cesar Millan, the dog whisperer, had died, but that wasn’t true.”

I’m not sure which of the following points scares me most:

  1. A casual news reader with no interest in journalism can immediately provide a recent example.
  2. My first reaction being: “Oh, that’s just one of those silly Internet death hoaxes.”
  3. Even articles about the dangers of misleading data journalism can themselves be misleading.

The answer is that it probably depends on which cap I’m wearing, so I’ll address them all below.

Articles about the dangers of misleading journalism can themselves be misleading.

In its article ‘The Power of Data Journalism’, the Harvard Political Review writes the following:

“Many prominent media outlets such as the New York Times unintentionally misreport data predictions when they report to the general public. For example, this article falsely asserts that Nate Silver has ‘already decided the election.’”

The referred CNN opinion piece however actually poses a question:

“On Election Day in 1980, when news outlets reported early that it looked like Reagan was going to beat Carter, voter turnout in California dropped 2%. Now we’re reporting the results weeks, even months, before voters show up at the polls. Why get excited about voting? Nate Silver has already decided the election, right?”

As a student of journalistic data analysis, this point concerns me the most. However, it shouldn’t surprise anyone who has followed this course, since right from the start we were told that “the principles of verification are timeless and can be applied to any situation” (Steve Buttry). Evidently, this is also true for articles about data journalism itself.

Internet death hoaxes are commonplace

Exactly when did it become normal to consider fake death reports commonplace? They have been around for decades: in 1969, rumors started to surface about the death of Paul McCartney. But in the last few years they have become so common that they are now an Internet meme. This troubles me as a human being. Death should not be treated as a joke. During the last year, my mom suffered from cancer, and every death in the news reminded us of our impending loss. In my opinion, journalists should therefore be careful before publishing this type of news, not only for the fans but also for those who are struggling with loss themselves.

Readers are used to reading incorrect news all the time

It is said that infrequent events tend to be remembered better than everyday events. Unfortunately, both my sister-in-law and guest lecturer Carel van Wyk presented us with recent examples, which suggests that this happens far too often. Carel van Wyk gave us four types of unreliable news:

  1. Reliability of information
  2. Reliability of wording
  3. Reliability of sourcing
  4. Reliability of visualizations

And it wasn’t so hard to find examples of all of them.

Reliability of information

The Harvard Political Review article is a good example of this. By presenting a question in an opinion piece as an assertion, it misrepresented not only CNN’s point, but their own as well. It also didn’t help that they used CNN as an example straight after name-dropping the New York Times.

Reliability of wording

“The Best Holiday Shopping Partner: A Capuchin Monkey”

When reading this headline at Fivethirtyeight.com, I couldn’t resist clicking the link, but I was disappointed. Instead of an organization that rents out monkeys for the holidays, it reported on recent research showing that capuchin monkeys care less about the price of things than humans do. An interesting article for sure, but in my opinion the article didn’t live up to the headline’s expectations.

Reliability of Sources

One of the news sites that broke the false news of Cesar Millan’s death was Distrita, which calls itself an independent “new and fresh magazine portal for electronics, travel, media and lifestyle.” In their apology, they offered the following explanation:

“Its a trend to post news about people die on social networks and my source Noticiasunam made me think its true.”

Noticiasunam is a satire site, and very open about this. It showed that the people behind Distrita were so eager to publish this news that they did not think to check their source. This is unlikely to happen to them again, but for now it is a good example of how sources can prove very unreliable, particularly when they set out to present false information.

Reliability of Visualizations

For this example, look no further than the one given in my second blog, which mentions how fivethirtyeight.com created a very misleading visual map by not verifying their data.

Verification and reliability go hand in hand

All four examples given here could have been avoided simply by properly verifying the information that was given or presented. Journalists aren’t perfect and media struggle with deadlines, but in the end… what matters more to the public? A reliable medium that allows you to check the news without wondering whether it is all correct? Or one that posts all the exciting rumors straight away and gives you lots of gossip, but not enough facts? The divide used to be clear, but with the onset of the Internet, journalists need to become more aware that if they want to belong in the former category, they should take the time to verify everything. Or be honest when they can’t.

The human factor – why facts are a work in progress

Fact checking is the bread and butter of journalism. The first principle of professional ethics in journalism is people’s right to true information. This leads right into the second principle: the journalist’s dedication to objective reality. In short, journalists are expected by everyone, including themselves, to verify every statement they publish.

In practice, this doesn’t always happen, which is why we’re used to editorial comments and corrections in newspapers and on Internet news sites. There are various reasons why journalists don’t always check the facts. The following three justifications were given in a recent Dutch study:

  1. Explicit accordance:
    1. We followed the rules, so it’s not our responsibility if our report turns out to be false after all.
  2. Practical accordance:
    1. Not enough time, resources, or money was available to check all the facts.
  3. Exceptional divergence:
    1. There was no reason to check the facts because this clearly wasn’t real news.

Speaking as a programmer, these justifications sound all too familiar. If you ever wonder why computer programs you paid good money for have bugs in them, just take a look at these justifications above. They are as true for programmers as they are for journalists. In fact, they are probably true for most professions since these three justifications can be summed up as follows: Humans aren’t perfect.

Unrealistic expectations 

One of life’s ironies is that imperfect people expect others to be perfect. In fact, they often expect themselves to be perfect… provided the world around them cooperates. And when it doesn’t, well… clearly that isn’t their fault. Personally, I don’t object to that point of view, as it leads to one very important thing: humans strive for the unattainable. Objectively, the two principles above may be unattainable, but in practice most journalists do strive to adhere to them and are held to account by both the public and their editors if they fail to do so.

But what about facts?

If people have unrealistic expectations about journalists, what about facts? Do we hold unrealistic expectations about them as well? In my opinion, yes.

According to Dictionary.com, a scientific fact is defined as follows:
“any observation that has been repeatedly confirmed and accepted as true; any scientific observation that has not been refuted”

This works well in science, but obviously not so much in the real world where some events only happen once and cannot be observed from afar. Journalists have found ways to deal with these limitations, such as the rule of thumb that a story isn’t published unless it has been confirmed by two independent sources.

However, data journalism deals with data, and thus allows its professionals to take a more scientific approach to facts, right? No matter how many times or how many people run the same numbers, the result should always remain the same. Unfortunately, the numbers aren’t the starting point but the end result of a human process.

Example: Figures from the Dutch Ministry of Education

Each year, the Dutch Ministry of Education presents the number of youths who leave school without a starting qualification. The numbers for the previous school year are to be handed in by the regional and local authorities before November 1. Meanwhile, the national Education Executive Agency (better known as DUO) also hands in the number of student registrations sent to it by each school. Based on these numbers, ministry officials calculate the national, regional, and local figures that are used by the government to determine policy.

On January 16, the Dutch government published the following statement on its website (translated from Dutch):

“The number of premature school leavers has dropped significantly during the last school year, to just 27,950 youths. (…) On the one hand, this decrease is due to the combined efforts of schools, local governments, and other partners. On the other hand, the decrease is due to better measurements, which clarify which youths truly leave school prematurely.”

The government admits that a new statistical measurement allowed these numbers to drop. What it does not mention, however, is an observation made a few weeks ago by someone at a meeting between DUO and the programmers who develop the software used by local governments:

“The problem with the government is that when they ask a question, they expect an immediate answer. They don’t understand that the data first needs to be gathered.” 

In other words, the numbers presented here exist because people were told they needed to register their work. However, people in different areas used different software programs, which is why the government and the various software companies got together to try to determine what exactly needed to be registered. But that was only one difference. The main difference lay in the different work processes used by the various local authorities. A big city in the west of the country, for example, automatically excluded youths who had found a job; a large, sparsely populated region in the east automatically sent mail every six months to ask whether employed youths without a diploma wanted to get back into school after all. Different circumstances require not only different actions but different terms as well, and that led to different interpretations of the same questions.
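The effect of those diverging local policies is easy to illustrate with a small sketch. Everything below is hypothetical (the records, field names, and counting rules are my own invention for illustration, not DUO’s or any municipality’s actual logic), but it shows how the same raw registrations can yield different “school leaver” totals under the two policies described above.

```python
# Hypothetical registrations of youths without a starting qualification.
# Field names and data are invented for illustration only.
youths = [
    {"name": "A", "diploma": False, "employed": True},
    {"name": "B", "diploma": False, "employed": False},
    {"name": "C", "diploma": False, "employed": True},
]

def count_city_west(records):
    # Illustrative policy 1: youths who found a job are excluded from the count.
    return sum(1 for r in records if not r["diploma"] and not r["employed"])

def count_region_east(records):
    # Illustrative policy 2: employed youths without a diploma still count.
    return sum(1 for r in records if not r["diploma"])

print(count_city_west(youths))   # prints 1
print(count_region_east(youths)) # prints 3
```

The same three registrations produce totals of 1 and 3 depending on the local interpretation, which is exactly why nationally aggregated figures built from such counts are the end result of a human process rather than a neutral measurement.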

Facts are a work in progress

If a good fact checker is a good reporter, how is he or she to deal with this issue? I’m afraid I’ll just have to repeat myself from earlier columns: keep the bigger picture in mind and delve deeper into the subject. Do not simply accept the facts, but ask yourself how they came about, and be open about these questions, both to yourself and to your audience.