GateHouse: O hai, internetz — we r fail


With David Carr’s argument that newspapers should ignore the Web only a few days old — not to mention Joel Brinkley’s suggestion that anti-trust violations are a viable business model — I thought the market for stupid newspaper-related activity was pretty well saturated. But apparently I was wrong. It seems that GateHouse Media, which owns a number of regional papers in the U.S., is suing the New York Times for linking to its content. Yes, you read that correctly — it is suing to stop the NYT from linking.

I am not making this up. If this sounds like a court case that might have occurred in the early 1990s, when sites of all kinds were just getting used to the Intarwebs, that’s because it is virtually a carbon copy of some of those early cases (GateHouse isn’t the only one — the Associated Press tried a similar tactic against the Drudge Retort this summer). The argument in a nutshell is that GateHouse is mad because the Times (or rather, the Boston Globe, which is owned by the same company) is “scraping” its headlines and the first paragraph of stories, and then “deep-linking” to the stories themselves, thereby copying the site’s content and stealing its traffic (as Mike Masnick at TechDirt points out, GateHouse is also apparently suing for breach of contract, because its articles are Creative Commons-licensed, but with a non-commercial license).

But surely this is the way the Internet works, you are saying to yourself. And so it is. GateHouse apparently doesn’t like the way the Internet works. That puts the company in the same category as the World Newspaper Association and forward-thinking types like Chicago Tribune owner Sam Zell, who have repeatedly criticized Google for linking to news stories from its Google News search engine, or the Belgian newspapers that sued Google over similar tactics. All of these groups are trying to turn back time, to play King Canute with the rolling wave that is the Web, instead of trying to find ways of using that wave to their mutual advantage.

From the way the lawsuit is described by Mark Potts at Recovering Journalist and elsewhere, it sounds as though some of this is about GateHouse and the NYT competing for traffic to their local blog and content sites. Regardless, as Henry Blodget at Silicon Alley Insider puts it, this is still a case of “dying newspapers suing each other.” What could be more pathetic? Update: I checked Howard Owens’ blog to see if he was talking about this, because he is (or was) the director of digital publishing for GateHouse, but he hasn’t updated his blog in a month or so.

Comments (18)

  1. Tish Grier wrote::


    you should take a look at Dan Kennedy’s post on this – he knows the Boston media scene better than anyone and has good insights (and PDFs). I also spoke with Dan, who got me thinking that Gatehouse *may* have a point if the only content aggregated on one of their hyperlocal pages is Gatehouse’s content, and they’re selling ads against it. Plus, we’d have to look at’s logs to see if they are indeed sending traffic over to Gatehouse’s site. We can’t just automatically assume that…

    Now, from a reader’s perspective, if I go to a page on a newspaper site – like the pages – that is supposed to be showing me hyperlocal content, and the only hyperlocal content it’s showing me is from another msm outlet (be it newspaper or TV), I’m going to think one of three things: that there’s no independent hyperlocal content (blogs) in the region; that the paper is lazy/greedy and doesn’t want to link to independent hyperlocal content; or that they can’t find independent hyperlocal content because google sucks for geotagged content. The page is going to have no value to me because it is only another msm outlet.

    And yes, I work for a hyperlocal aggregation site. We do NOT have ads next to our aggregated content because we do not want to get into an issue of making money off of other people’s content. We are, though, still trying to figure out our revenue model beyond getting grants to continue our work.

    Tuesday, December 23, 2008 at 7:01 pm #
  2. mathewi wrote::

    Thanks for the comment, Tish. Dan’s post is a good one, and I have
    linked to it. And I agree that selling ads against someone else’s
    local content is not really kosher — but if it’s a headline and a
    short excerpt, then as far as I’m concerned it is fair game (and fair
    use). GateHouse should do the same with’s content.

    Tuesday, December 23, 2008 at 6:51 pm #
  3. yelvington wrote::

    “we'd have to look at's logs to see if they are indeed sending traffic” … nope. The linking site does not know whether anyone clicks. Only the linked-to site would have that information.

    Tuesday, December 23, 2008 at 2:12 pm #
  4. David Mastio wrote::

    Nice headline. Wish I wrote it.

    Tuesday, December 23, 2008 at 2:03 pm #
  5. mathewi wrote::

    Thanks, David — feel free to borrow it :-)

    Tuesday, December 23, 2008 at 6:52 pm #
  6. Howard Weaver wrote::

    David Carr didn't argue that newspapers should ignore the web. He reported on a newspaper that does. Don't you recognize that difference?

    Tuesday, December 23, 2008 at 2:48 pm #
  7. mathewi wrote::

    Thanks for the comment, Howard. Yes, I recognize the difference. It
    seemed obvious to me that part of the reason why David wrote about it
    was that he thought other papers could learn from their example. Do
    you not agree?

    Tuesday, December 23, 2008 at 6:54 pm #
  8. gregorylent wrote::

    my me mine .. this concept is doing a fast fade, but it will take a bit more understanding about the nature of collective consciousness before copyright and ipr can follow along into the primitive past …

    Tuesday, December 23, 2008 at 3:02 pm #
  9. 300baud wrote::

    Whether or not to allow other sites to “deep link” to content is a policy decision that is within GateHouse's power to make. When they serve a page, they know what page the user came from. They choose to serve the page even when they know the request came from a “deep link” on They could just as easily redirect the user to the front page in this case.

    Of course that would be a petty and churlish thing to do, and if a lot of sites started doing it, browsers would start to hack around it. But it would still state the intent, and they'd have a better argument.

    Tuesday, December 23, 2008 at 5:38 pm #
  10. Iria wrote::

    Remember that earlier this year the AP made a big stink over linking as well. That one blew over, but here we go again.

    Tuesday, December 23, 2008 at 6:40 pm #
  11. mathewi wrote::

    That’s a good point, Iria — I added a link to a piece about that whole affair. I thought that this kind of debate had been settled already, but apparently not.

    Tuesday, December 23, 2008 at 7:02 pm #
  12. Sachin wrote::

    The guy who runs’s “Your Town” hyper-local sites used to be a VP at GateHouse’s Wicked Local. So perhaps this is more than just about linking :) ..

    Wednesday, December 24, 2008 at 1:37 am #
  13. mathewi wrote::

    Hmmm — that is an interesting twist, Sachin. Thanks for pointing that
    out. There is definitely a competitive dynamic going on here.

    Wednesday, December 24, 2008 at 9:14 am #
  14. test wrote::

    Drudge RETORT not Drudge Report….

    Wednesday, December 24, 2008 at 8:10 am #
  15. mathewi wrote::

    Yes — Drudge Retort not Drudge Report. Thanks.

    Wednesday, December 24, 2008 at 9:14 am #
  16. Mathew Ingram wrote::

    testing Facebook integration

    Wednesday, December 24, 2008 at 12:08 pm #
  17. Faizal Rahman wrote::


    Saturday, December 27, 2008 at 3:03 am #
  18. “GateHouse apparently doesn't like the way the Internet works” – that is just so primitive of them. They should just turn things around to their advantage by using the net rather than going against it.

    Sunday, December 28, 2008 at 5:08 am #

Trackbacks/Pingbacks (6)

  1. jayrosen_nyu (Jay Rosen)

    Mathew Ingram: Come on, GateHouse. This is the way the Web works. But Tish Grier says: not so simple.

  2. A link too far? | Kiesow 7.0 on Tuesday, December 23, 2008 at 2:32 pm

    […] GateHouse Lawsuit vs. New York Times Co. has Dire Implications A Danger to Journalism GateHouse: O hai, internetz — we r fail […]

  3. BuzzMachine » Blog Archive » A danger to journalism on Tuesday, December 23, 2008 at 6:01 pm

    […] LATER: Matthew Ingram has a v good response to the dustup: With David Carr’s argument that newspapers should ignore the Web only a few days […]

  4. GateHouse: Don’t Link Me, That’s Wicked! on Wednesday, December 24, 2008 at 1:26 am

    […] that websites link to each other and that’s how the web works. This case is reminiscent – as Mathew Ingram points out – of court cases that occurred in the early 1990’s when people were only beginning […]

  5. Printed Matters » The web abhors a vacuum on Monday, January 5, 2009 at 8:36 pm

    […] newspaper companies and their owners, managers, and website developers continue to espouse and implement website strategies that cut against the grain of what the web is all […]

  6. […] GateHouse filed its original complaint in November, the company was roundly criticized across the blogosphere for pushing back against the culture of linking that has come to define online journalism. Most […]