GateHouse: O hai, internetz — we r fail

by Mathew on December 23, 2008 · 24 comments

With David Carr’s argument that newspapers should ignore the Web only a few days old — not to mention Joel Brinkley’s suggestion that anti-trust violations are a viable business model — I thought the market for stupid newspaper-related activity was pretty well saturated. But apparently I was wrong. It seems that GateHouse Media, which owns a number of regional papers in the U.S., is suing the New York Times for linking to its content. Yes, you read that correctly — it is suing to stop the NYT from linking.

I am not making this up. If this sounds like a court case that might have occurred in the early 1990s, when sites of all kinds were just getting used to the Intarwebs, that’s because it is virtually a carbon copy of some of those early cases (GateHouse isn’t the only one — the Associated Press tried a similar tactic against the Drudge Retort this summer). The argument in a nutshell is that GateHouse is mad because the Times (or rather, the Boston Globe, which is owned by the same company) is “scraping” its headlines and the first paragraph of stories, and then “deep-linking” to the stories themselves, thereby copying the site’s content and stealing its traffic (as Mike Masnick at TechDirt points out, GateHouse is also apparently suing for breach of contract, because its articles are Creative Commons-licensed, but with a non-commercial license).

But surely this is the way the Internet works, you are saying to yourself. And so it is. GateHouse apparently doesn’t like the way the Internet works. That puts the company in the same category as the World Association of Newspapers and forward-thinking types like Chicago Tribune owner Sam Zell, who have repeatedly criticized Google for linking to news stories from its Google News search engine, or the Belgian newspapers that sued Google over similar tactics. All of these groups are trying to turn back time, to play King Canute with the rolling wave that is the Web, instead of trying to find ways of using that wave to their mutual advantage.

From the way the lawsuit is described by Mark Potts at Recovering Journalist and elsewhere, it sounds as though some of this is about GateHouse and the NYT competing for traffic to their local blog and content sites. Regardless, as Henry Blodget at Silicon Alley Insider puts it, this is still a case of “dying newspapers suing each other.” What could be more pathetic? Update: I checked Howard Owens’ blog to see if he was talking about this, because he is (or was) the director of digital publishing for GateHouse, but he hasn’t updated his blog in a month or so.

  • Pingback: A link too far? | Kiesow 7.0

  • http://www.mathewingram.com/work mathewi

    Thanks for the comment, Tish. Dan’s post is a good one, and I have
    linked to it. And I agree that selling ads against someone else’s
    local content is not really kosher — but if it’s a headline and a
    short excerpt, then as far as I’m concerned it is fair game (and fair
    use). GateHouse should do the same with Boston.com’s content.

  • http://www.mathewingram.com/work mathewi

    Thanks, David — feel free to borrow it :-)

  • http://www.mathewingram.com/work mathewi

    Thanks for the comment, Howard. Yes, I recognize the difference. It
    seemed obvious to me that part of the reason why David wrote about it
    was that he thought other papers could learn from their example. Do
    you not agree?

  • http://spap-oop.blogspot.com Tish Grier

    Mat–

You should take a look at Dan Kennedy’s post on this — he knows the Boston media scene better than anyone and has good insights (and PDFs). I also spoke with Dan, who got me thinking that Gatehouse *may* have a point if the only content Boston.com is aggregating on one of their hyperlocal pages is Gatehouse’s content, and they’re selling ads against it. Plus, we’d have to look at Boston.com’s logs to see if they are indeed sending traffic over to Gatehouse’s site. We can’t just automatically assume that…

Now, from a reader’s perspective, if I go to a page on a newspaper site — like the Boston.com pages — that is supposed to be showing me hyperlocal content, and the only hyperlocal content it’s showing me is from another msm outlet (be it newspaper or TV), I’m going to think one of three things: that there’s no independent hyperlocal content (blogs) in the region; that the paper is lazy/greedy and doesn’t want to link to independent hyperlocal content; or that they can’t find independent hyperlocal content because google sucks for geotagged content. The page is going to have no value to me because it is only another msm outlet.

    and yes, I work for a hyperlocal aggregation site–Placeblogger.com. We do NOT have ads next to our aggregated content because we do not want to get into an issue of making money off of other people’s content. We are, though, still trying to figure out our revenue model beyond getting grants to continue our work.

  • http://www.blognetnews.com David Mastio

    Nice headline. Wish I wrote it.

  • http://www.yelvington.com/ yelvington

    “we'd have to look at Boston.com's logs to see if they are indeed sending traffic” … nope. The linking site does not know whether anyone clicks. Only the linked-to site would have that information.

Howard Weaver

    David Carr didn't argue that newspapers should ignore the web. He reported on a newspaper that does. Don't you recognize that difference?

  • http://www.postlinearity.com gregorylent

    my me mine .. this concept is doing a fast fade, but it will take a bit more understanding about the nature of collective consciousness before copyright and ipr can follow along into the primitive past …

  • Pingback: BuzzMachine » Blog Archive » A danger to journalism

  • 300baud

    Whether or not to allow other sites to “deep link” to content is a policy decision that is within GateHouse's power to make. When they serve a page, they know what page the user came from. They choose to serve the page even when they know the request came from a “deep link” on Boston.com. They could just as easily redirect the user to the front page in this case.

    Of course that would be a petty and churlish thing to do, and if a lot of sites started doing it, browsers would start to hack around it. But it would still state the intent, and they'd have a better argument.

  • Iria

    Remember that earlier this year the AP made a big stink over linking as well. That one blew over, but here we go again.

  • http://www.mathewingram.com/work mathewi

That’s a good point, Iria — I added a link to a piece about that whole affair. I thought that this kind of debate had been settled already, but apparently not.

  • Pingback: GateHouse: Don’t Link Me, That’s Wicked!

  • http://republicofinternets.com Sachin Balagopalan

The guy who runs boston.com's “Your Town” hyper-local sites used to be a VP at Gatehouse's Wicked Local. So perhaps this is more than just about linking :) .. http://bit.ly/K29h

  • test

    Drudge RETORT not Drudge Report….

  • http://www.mathewingram.com/work mathewi

    Hmmm — that is an interesting twist, Sachin. Thanks for pointing that
    out. There is definitely a competitive dynamic going on here.

  • http://www.mathewingram.com/work mathewi

    Yes — Drudge Retort not Drudge Report. Thanks.


  • http://www.facebook.com/profile.php?id=619335545 Faizal Rahman

    cool

  • http://www.findknowledge.net Find Knowledge

“GateHouse apparently doesn't like the way the internet works.” That is just so primitive of them. They should turn things around to their advantage by using the net rather than going against it.

  • Pingback: Printed Matters » The web abhors a vacuum

  • Pingback: Howard Owens: “They would probably win on that one” » Nieman Journalism Lab » Pushing to the Future of Journalism
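For what it’s worth, the referrer check that 300baud describes in the comments above is trivial to build, which underlines that serving deep-linked pages really is a choice GateHouse is making. Here is a minimal sketch, assuming a small Flask app; the route, the blocked-host list, and the redirect target are all hypothetical, and the Referer header can be absent or spoofed, so this is illustrative rather than a recommendation.

```python
# Minimal, hypothetical sketch of referrer-based deep-link blocking:
# if the request arrived via a link from a blocked host, bounce the
# reader to the front page instead of serving the article.
from urllib.parse import urlparse

from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical list of referring hosts whose deep links get redirected.
BLOCKED_REFERRER_HOSTS = {"boston.com", "www.boston.com"}


@app.route("/stories/<slug>")
def story(slug):
    # request.referrer is the Referer header, or None if the browser sent none.
    host = urlparse(request.referrer or "").hostname or ""
    if host in BLOCKED_REFERRER_HOSTS:
        # The request followed a deep link from a blocked site:
        # send the reader to the front page instead of the article.
        return redirect("/", code=302)
    return f"Full story for {slug}"  # stand-in for the real article page
```

As 300baud says, doing this would be petty, and readers and browsers would quickly route around it, but it would at least express the intent without asking a court to redefine how linking works.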
