August 10, 2009
In copyright law, the big news is always made by cases such as the Jammie Thomas verdict, the Tenenbaum trial or even The Pirate Bay trial in Sweden. As important as these cases are, their legal applicability to the average person is dubious, especially since the RIAA has stopped suing file sharers.
For the cases that could have a direct impact on your life, you often have to dig deeper. This is true for the case of Brayton Purcell LLP v. Recordon & Recordon, a seemingly dull case about two law firms in a dispute over content posted on their respective Web sites.
However, a recent decision by the 9th Circuit Court of Appeals in the case, if upheld by other circuits or the Supreme Court, could have a drastic impact on the way copyright issues are litigated in the United States.
How big is the difference? The dissenting judge on the panel said the following, “Under the majority’s opinion, every website operator faces the potential that he will be hailed into far-away courts based upon allegations of intellectual property infringement, if he happens to know where the alleged owner of the property rights resides.”
In short, if you are accused of copyright infringement, it is no longer safe to assume that you will be sued in your own district; rather, you could be forced to litigate in the plaintiff’s court, enduring the extra costs and expenses that come with it.
Tags: content theft, copyright, copyright infringement, jurisdiction, Legal, plagiarism
March 30, 2009
Three weeks ago I signed up my blog for a beta service by Tynt called Tracer in an attempt to both test the service and get a better understanding of how people are using my content.
The information provided by Tracer is only aggregate in nature: there is no information about what an individual user did with your content. Tracer also does nothing to prevent copying, so it is not a DRM solution. All Tracer does is analyze how users interact with your content and which pages are the most “active”.
To do that, Tracer follows four metrics: page views, selections (meaning when someone selects objects), copies (actually copying the work) and generated traffic (clicks on links generated by Tracer).
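As a rough illustration, tallying those four metrics from a raw event log could look something like the sketch below. The event names and log format here are my own assumptions for illustration, not Tynt’s actual implementation:

```python
from collections import Counter, defaultdict

# Hypothetical event types mirroring Tracer's four metrics
EVENTS = ("view", "select", "copy", "click")

def tally(events):
    """Aggregate (page, event_type) pairs into per-page counts."""
    pages = defaultdict(Counter)
    for page, kind in events:
        if kind in EVENTS:
            pages[page][kind] += 1
    return dict(pages)

# Example log: two posts with a mix of views, selections and copies
log = [
    ("/post-a", "view"), ("/post-a", "select"), ("/post-a", "copy"),
    ("/post-b", "view"), ("/post-b", "view"), ("/post-b", "click"),
]
stats = tally(log)
print(stats["/post-a"]["copy"])   # 1
print(stats["/post-b"]["view"])   # 2
```

Because only counts are kept, nothing in such an aggregation ties an event back to an individual visitor, which matches the aggregate-only nature of the reports.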
After over three weeks of running the service, I’ve gotten some pretty good data on my site and the results more than surprised me. Here is what I learned.
Tags: copying, copyright, metrics, plagiarism, statistics, tracer, tynt
February 18, 2009
I’m working on my annual Things I Want Gone from the Web article and I’ve personally designated this “The Year of Original Content.” We’re done playing around with feed scraping and autoblogging.
The blog echo chamber effect, in which someone blockquotes and links the same content as a recommendation and it echoes through the web without any original content, is a beginner’s mistake. Don’t do it. Always add your original voice and content to your recommendations, telling your readers why it is important to leave this blog and go to another, then come back for more.
Google took action to penalize duplicate content within a site and between sites, and added bonus points for original, unique, appropriate and relevant keywords around links, especially link lists, rewarding original content providers with better PageRank scores. Similar actions are being taken by other major search engines, directories, and legitimate content aggregators.
As a serious blogger, you’ve learned the lesson and stay focused on creating original content. You link to other people’s content appropriately, taking care to protect their copyrights and not confuse your readers, putting other people’s content in blockquotes with clearly indicated links and credits.
For scammers, scrapers, and plagiarists, other people’s content has turned into a major money-maker as they use other people’s content for financial gain and misdirection.
Tags: blog security, copyright, copyright violation, DMCA, duplicate content, original content, plagiarism, report copyright violations, scam, scams, scraper, scraper blogs, Security, spam blogs, splog, splogger, stop plagiarism, year of original content
February 16, 2009
Back in November, Attributor released a study that intrigued many Webmasters and content providers. According to the report, for many Websites, most of the viewings of their content do not happen on their page or their RSS feed, but on other sites.
Earlier this month, the same company announced the public beta of its new product, FairShare, a free service designed to help bloggers track their content’s usage, check for license compliance and understand who is using their works and how.
Though the service has some limitations, it can be a valuable tool for bloggers to get a glimpse at how their content is used on the Web and where some of their untracked readers may be hiding.
Tags: attributor, content theft, copyright infringement, fairshare, plagiarism, rss
January 19, 2009
It seems that every week a new product or service is announced that promises to protect your work in some way or another. Whether it is helping you “register” your copyright, detect plagiarism or even outright prevent infringement, there are tons of companies that want to take your money to protect your work.
However, most of these products turn out not to live up to their hype. At best they are a waste of time; at worst they are an outright scam.
So who is out to scam you and who is here to help? Well, here are some of the more common types of copyright protection services and what you should look out for before you sign on the dotted line.
Tags: content theft, copyright, copyright law, copyright registration, plagiarism, usco
January 5, 2009
Bloggers use, and don’t use, Creative Commons Licenses for a variety of reasons. Some feel that it is a great way to give back to the community, others use CC licensing as a form of promotion, encouraging their content to be used with attribution, and others feel that it is a way to promote copyright reform.
However, Creative Commons can actually provide bloggers benefits that go well beyond the buttons and badges. In the uncertain copyright climate of the Web, having a firm, lawyer-written license, regardless of what it says, can have huge benefits over the ambiguity that comes with not having one.
Here are just five less-promoted ways that choosing a CC license can help you, your site and your content, even as you surrender some of your rights in a particular work.
Tags: content theft, copyright, copyright infringement, creative commons, fair use, plagiarism
December 8, 2008
Yesterday, on CenterNetworks, Allen Stern reported on a new social news site, Social|Median. The story, however, didn’t center on Social|Median’s features or capabilities, but instead on how, according to Stern, it “take(s) content from around the Web, put it onto Socialmedian and let you comment about it.”
Though I did not see any widespread copying of content on the links that I checked (example), it appears that the amount of content copied in the snippet is determined by the user posting the link, not the site.
Still, it is clear that there has to be a balancing act between social media and content creators. Though social news sites need to use some of the content and conversation from the blog in order to properly function, if they take too much, there is nothing left to encourage content creators to participate or permit their works to be used.
Finding this balance is tricky and has been a problem that has plagued social news sites since the beginning. Many sites have faced criticism for “scraping content” or “fragmenting the conversation” and the concern remains at the top of mind of many Webmasters, especially when dealing with new social news sites that do not drive significant traffic.
So how should social news sites behave? The law is not very clear, but the standards of the Web seem to have spoken to at least some degree.
Tags: aggregation, content theft, copyright, copyright infringement, plagiarism, rss, scraper, scraping, social news
September 29, 2008
When Steven Carrol of The Next Web admitted to using a content generation service known as Datapresser, reportedly after seeing it used by an unnamed author at TechCrunch, he seemed to indicate that it was the future of mainstream blog publishing.
But while there is no doubt that at least some mainstream blogs use content creation tools to aid in meeting their deadlines, content generation has found a much more comfortable home with another group: spammers.
Creating content from nothing has always been something of a holy grail for spammers. Traditionally, filling their junk blogs has required scraping content from article databases, other blogs (usually without permission) or other sources. This has made them easy for search engines to spot and also drawn the ire of many bloggers who have had their content reused.
But technology is advancing and content generation is becoming increasingly practical. Many spammers have already moved to it and it seems likely that others will follow soon. This has some strong implications for both the future of spam and the Web itself.
Tags: content theft, copyright infringement, plagiarism, rss scraping, scraping, Spam, spam blogs, Splogs
September 2, 2008
A blog post linking to one of my blog posts has been scraped dozens of times. Recently, it was scraped by eight different sites in the same day. The eight trackbacked sites turned out to have a single owner/webmaster using their auto-blogging scraper across multiple splog sites. I’ve let the blogger know – after the second time it happened – and now that it’s happened multiple times, it’s time to change strategies.
It’s now time to work together.
Have you received multiple trackbacks over time from a blog post with a link to yours, only to find upon investigation that it isn’t the original site but a scraper? What do you do?
Tags: blog content, blog writing, Blogging, Content Scraping, copyright, copyright violation, jonathan bailey, plagiarism, plagiarism today, scrappers, splogging, Splogs
June 2, 2008
Fighting spam has proved to be a nearly impossible task.
The best and brightest minds of the legal and technical worlds have failed to come up with solutions to stem the flow of junk email, splogs or spam comments.
Every new law or technological advancement has just been an escalation in a never-ending arms race between the many who hate spam and the few that send it out.
To be certain, spam plays a much smaller part in our lives today than it did a few years ago. We rarely see spam in our inboxes, spam comments are largely filtered out and only search spam seems to work with any reliability, especially with blogs.
However, the junk content keeps flowing at an ever-increasing rate. More and more junk email gets sent out every year, and comment spam is on the rise too.
We have managed to treat the symptoms, but not the illness. This is because we have been dealing with how spam ails us, one issue at a time, rather than looking at the bigger picture.
It is time to take a look at the spam puzzle and how it all fits together.
Tags: copyright, Ethics, Legal, plagiarism, Spam, Splogs, Technology