Corante

About this Author
Ernest Miller pursues research and writing on cyberlaw, intellectual property, and First Amendment issues. Mr. Miller attended the U.S. Naval Academy before attending Yale Law School, where he was president and co-founder of the Law and Technology Society, and founded the technology law and policy news site LawMeme. He is a fellow of the Information Society Project at Yale Law School. Ernest Miller's blog postings can also be found @
Copyfight
LawMeme

Listen to the weekly audio edition on IT Conversations:
The Importance Of ... Law and IT.

Feel free to contact me about articles, websites, etc. you think I may find of interest. I'm also available for consulting work and speaking engagements. Email: ernest.miller 8T gmail.com


The Importance of...

Category Archives



June 17, 2005

Microsoft on Chinese Censorship: We Censor in the US Too!


Posted by Ernest Miller

The LA Times has a very good article on Microsoft's censorship of blogs in China, the background and the controversy (As China Censors the Internet, Money Talks). For more about this issue, I highly recommend Rebecca MacKinnon's My Response to Scoble.

But there was something that struck me about Microsoft's response to issues of censorship in China, according to the LA Times:

Microsoft adds that filtering objectionable words is nothing new. In the United States, the company blocks use of several words in titles, including "whore" and "pornography."

That's just great. What a fantastic way to show your support for freedom of expression, Microsoft. When people accuse you of censorship in China, justify your actions by proclaiming your support for censorship in the United States. I'm sure the Chinese government is very appreciative that you're implying a moral equivalence between China and the United States on questions of free speech.

Now, this isn't recent news; BoingBoing pointed this out in December 2004 (MSN Spaces: Seven Dirty Blogs). But really, when you're defending your censorship policy in China, do you really want to brag about how you censor in the US, home of the First Amendment? Is this helpful? On any level?

And, you know, the policy is still asinine, as Xeni demonstrated so ably. Another, more recent example: if you wanted to discuss ICANN's new top level domain, .XXX, you wouldn't be able to put the .XXX in the title - which might result in some weird contortions of language. And I guess some of the titles of my past posts would be too risqué for Microsoft: PIRATE Act Reveals Sen. Hatch as Strange Ally of Pornography Industry; Little-Known Anti-Pornography Statute Threatens Free Speech; and The INDUCE Act (IICA) - Putting the Pornography Industry in Charge. Thank goodness I don't use MSN Spaces.
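To make the mechanics concrete, here's a rough sketch (in Python, with an invented word list - Microsoft's actual list and matching rules aren't public) of the kind of blunt substring blocklist that produces exactly these false positives:

    # Hypothetical sketch of a blunt title blocklist; the real MSN Spaces
    # word list and matching rules are not public.
    BLOCKED_WORDS = {"whore", "pornography", ".xxx"}  # assumed entries

    def title_allowed(title):
        """Reject a title if any blocked word appears anywhere in it."""
        lowered = title.lower()
        return not any(word in lowered for word in BLOCKED_WORDS)

    # Substring matching blocks perfectly legitimate titles:
    print(title_allowed("Little-Known Anti-Pornography Statute Threatens Free Speech"))  # False
    print(title_allowed("ICANN Approves '.XXX' Top Level Domain"))                       # False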

Microsoft probably doesn't have much of a choice with regard to censorship in China, but that doesn't mean they can't demonstrate a commitment to free speech. They could start by getting rid of their censorship policy here in the US. At the very least, they could stop bragging about it.

Comments (1) + TrackBacks (0) | Category: Freedom of Expression | Rating and Filtering

June 14, 2005

Podcasting and Profanity


Posted by Ernest Miller

Last Sunday, on Corante's Podcasting, Jeff De Cagna asked what role profanity plays in podcasting (Profanity in Podcasting: What is its Role?).

But there is an even more fundamental inquiry I'd like to pose here: what is the role of profanity in podcasting? Do we need to curse to demonstrate our fidelity to free speech? What is the point at which our defiant acts against the FCC will cease to be purposeful, and we will just become garbage mouths in the eyes (and ears) of our listeners? I know I'm probably messing with the bull here, so I'll be prepared! [emphasis in original]
The answer is simple, really. It plays whatever role the speaker desires. If that role doesn't mesh with the role the audience cares for, the audience will stop listening.

Use profanity, don't use profanity. It's a judgement call.

The real question is whether some censorship regime is necessary.

Last week on the Yahoo podcasters group, there was an extremely passionate discussion (complete with name calling) of profanity in podcasting and how it can be screened by listeners who prefer to avoid it themselves or want to keep it away from their kids. At the moment, of course, there isn't a way to screen for profanity short of listening to the podcasts. Some group members advocated a voluntary ratings system, while others recoiled at the suggestion. A key question is who gets to decide what is or isn't profane and by what cultural standard, an extremely relevant matter given podcasting's global reach. [link in original]

But really, is this necessary? The internet has a number of rating schemes; they're mostly useless. I've never noticed any blogs that are rated, so why should podcasts be? Depending on the audience, most blogs simply exercise a judgement call. Some refuse to publish vulgarities; others do. Sometimes the sites warn their readers, sometimes they don't. Seems to work just fine.
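For what it's worth, if some voluntary label ever did catch on, screening on the listener's end would take almost no machinery. A minimal sketch, assuming a purely hypothetical <rating> element in each feed item (no such standard exists here, and the hard question of who labels what remains untouched):

    # Minimal sketch: skip feed items carrying a hypothetical <rating>explicit</rating> tag.
    # The <rating> tag is invented for illustration, not a real standard; the point is
    # how little machinery listener-side screening would actually require.
    import xml.etree.ElementTree as ET

    FEED = """<rss><channel>
      <item><title>Clean show</title><rating>clean</rating></item>
      <item><title>Salty show</title><rating>explicit</rating></item>
      <item><title>Unlabeled show</title></item>
    </channel></rss>"""

    def family_safe_items(feed_xml):
        root = ET.fromstring(feed_xml)
        for item in root.iter("item"):
            rating = item.findtext("rating", default="unrated")
            if rating != "explicit":   # unlabeled items pass - which is the real policy fight
                yield item.findtext("title")

    print(list(family_safe_items(FEED)))  # ['Clean show', 'Unlabeled show']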

Of course, I'm sure the topic will come up again and again and again ...

Comments (2) + TrackBacks (0) | Category: Broadcatching/Podcasting | Freedom of Expression | Rating and Filtering


June 03, 2005

ICANN Doesn't Censor, Governments Do


Posted by Ernest Miller

Everyone has commented on the initial reports concerning the new ".xxx" top level domain approved by ICANN. For example, see my initial post (ICANN Approves '.xxx' Top Level Domain). However, there is more to learn as the decision is analyzed.

Of course, those behind the domain are very concerned with the free speech issues involved. They've even brought well-known First Amendment attorney Robert Corn-Revere onboard as retained counsel. In a post to the ICANN ".xxx" thread, he defends the .xxx domain from charges that it will be made mandatory in the US (Legal Protections for the Voluntary Nature of the .xxx Domain).

But if the U.S. government tried to require use of a .xxx address by designated entities, such a regulatory scheme would likely be found to be unconstitutional.

Ok, so it would be unconstitutional to mandate use of ".xxx" in the US. But would it be unconstitutional to require libraries and schools to block access to ".xxx" on pain of not getting federal funds so long as a patron can view .xxx domains by asking specific permission? Properly drafted, probably not, following US v. ALA, the CIPA case.

Well, that wouldn't be too bad, would it?

Requiring websites to adopt the .xxx domain might be illegal in the US, but other nations *cough*China*cough* are not so legally restrained. If China, or another nation, decides to require certain websites to use the .xxx domain, there is little that Corn-Revere will be able to do. Not a big deal? Well, if you're in a US library, suddenly you might not be able to access such a website. You could ask for specific permission, but you might not even know the site exists because of the filtering. And try sending email from a .xxx domain. Furthermore, the content of the site would be tarnished by its association with .xxx.
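That's the whole appeal of a dedicated TLD to a censor: blocking it takes a single string comparison, with no need to look at the content at all. A toy illustration (my own, not any filter vendor's code):

    # Toy illustration of why a censor loves a dedicated TLD: one string test
    # replaces any attempt to analyze what the page actually says.
    from urllib.parse import urlparse

    def blocked(url):
        host = urlparse(url).hostname or ""
        return host.endswith(".xxx")

    print(blocked("http://example.xxx/essay-on-censorship"))  # True - content irrelevant
    print(blocked("http://example.com/the-same-essay"))       # False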

China's censors are experts at their work. If they can use .xxx to make their censorship more effective both internally and externally, they will.

Remember, poorly thought-out changes to the registry system don't censor; governments do. It won't be their fault - just let them assure you.

via Infothought

Comments (0) + TrackBacks (0) | Category: Freedom of Expression | Internet | Rating and Filtering

June 01, 2005

ICANN Approves '.xxx' Top Level Domain


Posted by Ernest Miller

The Associated Press reports that ICANN has approved a new top-level domain ".xxx" (Internet Group OKs 'xxx' Web Addresses). Read ICANN's brief announcement here: ICANN Moves Forward in First Phase Commercial & Technical Negotiations with an Additional sTLD Applicant. ICM Registry is the lucky company that gets to sell the domain names for 10x the going price of a .com domain. Read their press release: ICANN Approves .xxx Sponsored Top-Level Domain Application.

An international cross-section of the responsible online adult-entertainment industry has expressed its support for the new TLD and indicated its willingness to work through the International Foundation for Online Responsibility (IFFOR), the Canadian non-profit, multi-stakeholder organization that is sponsoring the TLD to combat child pornography and to regularize business processes and to ensure that children and others who do not wish to access adult entertainment online can easily avoid it.

Great. It is voluntary now, but expect renewed efforts in Congress and certain of the states to require those with material "harmful to children" to register in the .xxx domain. Also, expect a rash of new trademark litigation, particularly with regard to dilution and tarnishment, when .xxx domain sales begin. And what is this about stopping child porn? How will a top level domain have any impact on that?
via How Appealing

UPDATE 2330 PT

Seth Finkelstein calls the new domain "pornographic titillation" (.XXX Domain Approved).

Techdirt links to a number of their previous postings on the concept (ICANN Finally Agrees To Build An Online Red Light District).

Comments (1) + TrackBacks (0) | Category: Internet | Rating and Filtering | Trademark

Hollywood Could Help Fight Child Porn, But They Don't


Posted by Ernest Miller

The Dallas Morning News (reg. req.) runs a story that insinuates that P2P companies could stop child porn on their networks but are reluctant to do so because then they would also be able to stop copyright infringement (Child Porn Tests File-Share Firms). In other words, these companies are scumbags unwilling to fight child porn so they can profit from infringement. The subtext: the Supreme Court should rule against them in Grokster to protect us from child porn:

File-sharing companies could find ways to block known illegal files before they're sent, said Detective Greg Dugger, a member of the Dallas Police Department's Internet Crimes Against Children unit.

But then they'd probably have to do the same thing for copyrighted works, and they'd lose their users instantly, he said.

"If one of these clients does the right thing, they'll probably be out of business the next day," he said.

A Supreme Court ruling in favor of entertainment companies could be the way to make P2P companies aggressive about policing their own networks, Mr. Burbach said.

But if the court rules in favor of P2P firms, the industry may have to prove it's no haven for pedophiles, said Rick Wallace, a full-time student in Illinois.

His Web site, www.seewhatyoushare.com, tracks the ways consumers make themselves vulnerable through P2P software.

"At a certain point, when you have children being exploited on networks the way they are, something's got to give," he said.

I'm not even going to get into whether filtering would actually work. The facts of the matter seem too confusing for law enforcement:
Police who specialize in child porn cases consider P2P networks dangerous because they can disseminate information to many people very quickly.

The P2P networks also give users the misguided impression that they're completely anonymous.

Most popular P2P programs don't have a central repository of data tracking which users are sharing specific material.

Even so, it's possible, with the right tools, to identify P2P users.

Entertainment companies have developed and bought tools that can identify the Internet addresses of P2P users.

Law enforcement agencies have more limited budgets, but they're reviewing similar options.

Let's see, P2P networks give users the false impression of anonymity. Doesn't this mean it will be much easier to identify who is sharing child porn? Why would you want to change this impression? If you make P2P illegal or have obvious tracking, the child pornographers will only move to distribution means that are harder to track. Police should be thanking P2P companies for making it easier to catch child pornographers.

And why the focus on what P2P companies can do? It is Hollywood that has the tools to track file-sharing; they've sued over 10,000 people. So, why don't police ask Hollywood to help them fight child porn? Why isn't Hollywood sharing this technology? Seems to me that Hollywood could fight child porn if they wanted, so how come they're not?

File-sharing networks are one of many places on the Internet where pedophiles lurk.

They also transmit their images through chat rooms, newsgroups, e-mail and even Web pages. "You can find them just going through Google," Mr. Burbach [Texas deputy attorney general] said.

Hmmm. Why aren't they asking ISPs to run filters to identify child porn files? It seems that if you really wanted to stop all these methods, there is only one place to go: the ISP. Of course, if an ISP started filtering everything for child porn, it is likely they would lose their users instantly. So, clearly, ISPs are unwilling to fight child porn in order to maintain their profits.

But then, it is politically more acceptable to bash P2P companies rather than large well-financed ISPs.

UPDATE 0700 PT 2 JUN 2005
The author of the article responds in the comments below.

Comments (7) + TrackBacks (0) | Category: File Sharing | Rating and Filtering


July 07, 2004

Filtering Out Blogs


Posted by Ernest Miller

Prof. Michael Froomkin ran into some troublesome web filters while traveling overseas (Annoying Filter Update). Popular blogs such as Atrios and the Volokh Conspiracy were blocked because of naughty words. Censorware expert Seth Finkelstein points out how commenters could take advantage of this to get blogs they don't like blocked by filters (Censorware usable for blog Denial-Of-Service Attack?). Given that libraries and schools must use filters or be denied federal funding, this is not such a minor point.
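The mechanics of the attack are simple enough to sketch. A toy page-level word filter (my own illustration, not any actual censorware product's logic) scores the whole rendered page, so a hostile comment is indistinguishable from the blogger's own words:

    # Toy page-level word filter, not any real censorware product's logic.
    # Because the whole rendered page is scored, words injected via a comment
    # read no differently than words written by the blogger.
    NAUGHTY = {"whore", "pornography"}   # assumed word list

    def page_blocked(post_text, comments):
        page = (post_text + " " + " ".join(comments)).lower()
        return any(word in page for word in NAUGHTY)

    post = "A sober analysis of indecency regulation."
    print(page_blocked(post, []))                              # False
    print(page_blocked(post, ["pornography pornography!!!"]))  # True - one comment blocks the whole blog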

Comments (0) + TrackBacks (0) | Category: Freedom of Expression | Rating and Filtering

April 05, 2004

The Speech Powell Should Have Given on Indecency


Posted by Ernest Miller

Last week, I wrote an annotated version of two recent speeches, one by FCC Chairman Michael Powell and the other by Commissioner Michael Copps, in which they addressed (behind closed doors) the National Association of Broadcasters regarding indecency regulation (FCC Commissioners - No Free Speech Please, We're Americans). Frequent commentator Cypherpunk thinks that I was overly harsh with regard to Michael Powell, who formerly was a strong defender of freedom of speech in broadcasting (Too Rough on Powell).

Rather than simply rebut Cypherpunk, I've adapted Powell's speech to give my version of what he should have said at the NAB meeting.

The original speech is here:
Remarks of FCC Chairman Michael Powell at the NAB Summit on Responsible Programming, The Renaissance Hotel, Washington D.C., March 31, 2004 [PDF].

Read on for my revised version.


Comments (0) + TrackBacks (0) | Category: Freedom of Expression | Rating and Filtering | Telecomm

December 11, 2003

Porn, Compulsories and Filtering


Posted by Ernest Miller

As CNN notes on an otherwise slow news day, porn is a popular business on the internet (Sex sells, especially to Web surfers). However, see Seth Finkelstein's dissection of the "report" CNN is relying on (N2H2 "State Secrets" - PR and lying with statistics [part 1]) and (CNN, "web porn", and censorware PR Managers).

Regardless of the validity of the report, it is undisputed that pornography is popular on the internet, including on P2P networks (of course, porn has been popular in every medium). Note that, contrary to some claims, pornography hasn't been shown to be more of a problem on P2P networks than on the internet generally, as a leaked GAO memo obtained by TechNewsWorld concludes (U.S. Congress: P2P E-Smut 'Not Necessarily' More Dangerous than Other Forms).

In any case, the debate over compulsories has raised a serious barrier to their implementation - the political unpopularity of systems that would provide cross-subsidization for pornography. In other words, taxes (whether a levy or general taxation) would be collected and then distributed to pornographers. This would not be, to put it mildly, politically popular. Furthermore, I use the term "pornography" only as the most blatant example of content that would be politically unpopular. I can imagine, for example, that certain genres of music, such as "gangsta rap," would raise similar objections (how would people feel about tax dollars subsidizing music that glorifies cop-killing?). This is a serious problem and one that hasn't really been addressed by proponents of government-mandated compulsories, especially given the track record of political debate over the relatively small amount of money dedicated to the National Endowment for the Arts.

Nevertheless, the issue of compulsories and pornography may create other problems as well. One I am concerned about is the potential for mandatory filtering to go along with the mandatory compulsories. Although none of the proposed compulsory systems speaks to the issue of filtering (and I am sure the proponents would oppose it), the systems certainly enable a mechanism that would make such filtering possible. All of the proposed government mandated systems envision some form of centralized registry for copyrighted works so that the works can be monitored and tracked and appropriately compensated. How much more of a step would it be to require works in the registry to also include self-labeling information?

I can imagine that many people would claim, for example, that pornographers shouldn't be compensated for having their files shared by minors. Two 15-year-olds file-share a pornographic movie. Should the pornographer be compensated? If not, then the system will have to include self-labeling by the pornographer as well as parental controls (filters) in the file-sharing/playback devices. How will this work? Will political pressure force "voluntary" labeling schemes onto content producers who wish to be compensated? How will the survey/monitoring systems handle devices with and without filtering mechanisms?
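To see how short a step it is, imagine - purely hypothetically, since no proposal specifies any of these fields - that each registry record carried a self-label alongside the data needed for compensation, and that playback or file-sharing software consulted it:

    # Purely hypothetical: a compulsory-license registry record that also carries
    # a self-label, and a client that filters on it. No actual proposal specifies
    # these fields; the point is how naturally the compensation plumbing could
    # double as a filtering hook.
    REGISTRY = {
        "work:0001": {"owner": "Some Label", "self_label": "general"},
        "work:0002": {"owner": "Adult Studio", "self_label": "adult"},
    }

    def playable(work_id, parental_controls_on):
        record = REGISTRY.get(work_id)
        if record is None:
            return False                  # unregistered works aren't tracked or compensated
        if parental_controls_on and record["self_label"] == "adult":
            return False                  # the same lookup that meters payment now filters
        return True

    print(playable("work:0002", parental_controls_on=True))   # False
    print(playable("work:0002", parental_controls_on=False))  # True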

No compulsory scheme advocates for labeling and filtering. However, we should consider likely ramifications of such compulsory schemes, and increased political pressure for labeling, whether "voluntary" or not, is likely.

Comments (2) + TrackBacks (0) | Category: Copyright | File Sharing | Freedom of Expression | Rating and Filtering

October 30, 2003

RIAA Now a Proponent of Rating and Filtering


Posted by Ernest Miller

Billboard is reporting that the RIAA now intends to request that online music companies implement "effective parental-control filters to provide parents more information and control over what their children can download" (RIAA Pushing Advisory Warnings On Downloads). The announcement came at an FTC workshop dealing with the marketing of violent entertainment to children (Marketing Violent Entertainment To Children: A Workshop on Industry Self-Regulation).

Considering that the RIAA strongly resisted implementing any rating and labeling standards until essentially forced to by congressional scrutiny in the late 1980s, it seems odd that the RIAA would now be recommending filtering. Indeed, of all the rating systems, the RIAA's is the least granular - a recording either has a parental advisory or it does not (Information for Parents - Parental Advisory).

So why push now for filtering, which the RIAA of 20 years ago most likely would have strongly opposed? The answer is obvious - as a way to attack P2P. Note how the article puts it:

[RIAA chairman/CEO Mitch] Bainwol said the RIAA's guidelines "will reinforce the importance of consistent descriptors across all services" and should "help parents draw a distinction between the pirate peer-to-peer networks and legitimate online music services. [emphasis added]"

In other words, the RIAA is hoping that, since unlicensed P2P systems are unlikely to have effective rating and filtering systems, parents will turn to licensed systems for music downloads and prevent their children from using unlicensed P2P networks.

I haven't been able to find the new standards for rating and filtering online yet, but it will be interesting to take a look at them.

Comments (0) + TrackBacks (0) | Category: File Sharing | Rating and Filtering