Policing the Internet: Jake Baker and beyond

March 9, 1995


Questions

PROFESSOR LOWENSTEIN:

I think because the opening speeches went a little longer than we had anticipated, I sense some restlessness among the natives, and I want to open it up to questions from the audience right away.

AUDIENCE PARTICIPANT:

I'm Matt Pulliam from Ford Motor Company.

I'd like to ask Mr. Charney a couple of questions. A couple of years ago in Massachusetts, Phil Zimmermann developed the first widespread public encryption system for encrypting E-mail communications.

He is in the process of being indicted right now by the Justice Department for exporting this technology as a munition, which is the equivalent of helping, say, (inaudible) produce their weapons.

Based on your record and the handling of the (inaudible) case, I believe that you have a strong interest in protecting people's civil liberties, so I'd like to ask: how can the Justice Department go after one of our prime spokespeople for the Internet?

SCOTT CHARNEY:

I'm going to answer your question in a bit of a roundabout way, and the reason for that is you'll notice in my opening comments I did not mention the Baker case either. The reason for that, and the same reason I can't discuss the particulars of the Zimmermann case, is that they're pending cases, and I am prohibited from discussing them.

The issue of cryptography generally, however, is a fairly complex one.

On one hand, we do want people to have strong cryptography so that they can protect their own privacy, not only for their own purposes, but because hackers have been known to wiretap phone switches, intercept other people's communications and steal their data.

So you're back to wanting strong cryptography on one hand.

On the other hand, just like the anonymity problem, there are people who will use cryptography for bad things.

In this context, in the encryption context, the U.S. has two distinct but significant interests. One is that law enforcement not be thwarted in doing their investigations because data's encrypted, but the U.S. also has a broader national security interest.

If you look, for example, historically at World War II, our ability to break the German ciphers and the Japanese code was critical to our successes in the war. And if you don't believe me, you can read Bodyguard of Lies by Anthony Cave Brown, which is probably the best book on intelligence in World War II.

So we have this problem that on one hand you want the proliferation of strong encryption, but on the other hand, if strong encryption gets in the wrong hands, you shoot yourself in the foot.

It's a very difficult issue to resolve, because the values on both sides of the equation are legitimate.

I mean, you can take something like murder and say, that's obviously bad, so people who are against it are good and people who are for it are bad. That's easy. But cryptography is much harder.

The people who support cryptography, the use of cryptography, include not just civil libertarians and privacy advocates, but the government. We encrypt a lot of our data for the very reason that we're protecting the sanctity, integrity and confidentiality of government information.

On the other hand, we have the problem that if cryptography is misused or used by people doing bad things, we end up in trouble.


PROFESSOR LOWENSTEIN: Another question over here.

AUDIENCE PARTICIPANT:

My question is directed primarily to Professor MacKinnon, but I guess I'd be interested in other members of the panel commenting as well.

One can argue that the primary reason that Jake Baker is in jail is that he made the mistake of using his own name. Indeed, a fair majority of the material out there is sent anonymously, and the current state of the technology is such that there's virtually no way to guarantee who sent a message -- sending it anonymously or through various anonymous remailers is commonly done.

If that indeed is the case, indeed there is no way to really determine who sent this, and indeed you believe that nevertheless this material is pornographic and causes harm, what is your proposed remedy for the vast majority of this where there is no way to determine who actually sent it? Indeed, as the Exon bill was mentioned, is the only solution in that case to penalize the carriers of the material? And, indeed, what are the implications of that?

PROFESSOR MACKINNON:

Like everyone else, I'm in the process of both learning about the technology and thinking through the very specific questions that it raises as well as trying to connect that with everything else that we already know about these issues.

Mr. Steinhardt has no way of knowing how many real women are in the pornography out there that is already there. I know a lot of real women who are in pornography. That includes pornography in words only, and visual pornography.

So there is no way of saying, to your point, how many -- you know, whether this is an unusual case or not. There are a great many ways in which it is not unusual. It is one about which I know a great many other examples, and it's hard to know how many of the Lisas and Jennys and Marys that are in these materials aren't real women.

You know when you see your real name in it, even if it's only your first name, you might be forgiven for thinking that it has something to do, at least, with a woman, and maybe you.

Now, the matter of anonymity. I'm attracted to what Mr. Charney described as the confidentiality solution, but I don't want to give this out as a position. I think it's hard to figure this out, but my own disposition is against absolute anonymity, while there are important circumstances for it -- just as an example, people who have been sexually abused being able to talk with each other about that abuse.

It seems to me that creating a mechanism for speech, in which, by design, accountability is impossible is a real hazard. And at the same time I'm extremely bothered by the privacy issues that are raised by the fact that you can just get access right now, to what anybody's reading, if you want. You can just go up there, put in a name, and find out what they're reading on the Net. I think that's really very disturbing.

So I think that the confidentiality approach in which anyone who's on there, if it becomes necessary in some way, should be able to be, at some point down the line, found out or gotten to or identified, is an attractive solution that doesn't really necessarily solve anything because, you know, people will use false names, they will create false identification and so on. But that's always true, and so it's more a question of a basic design, and that's just how I see it.

I think I'm against anonymity, but I'm for a form of confidentiality that allows for privacy that should be available to people, while not insulating them, ultimately, from accountability for harm that they actually do through this medium just like any other.

PROFESSOR LOWENSTEIN: Let me just ask Danny Weitzner who wanted to add something to that question.

DANIEL WEITZNER:

I just want to say a few things about anonymity. I really do think, as a fundamental issue, that we all have to have the right to anonymous communications. There are circumstances in our political lives, in our personal lives, in all kinds of situations, where I really think it is critical that we be able to communicate anonymously, not just confidentially, because the people who have the power to move something that is anonymous or confidential to something that is, in fact, identifiable may be the very people from whom we really need the protection of anonymity.

That being said, I think as a practical matter anonymity is somewhat of a red herring. I think the real disputes about anonymity now are, in some sense, a function of the fact that life on line, if you will, is not very well formed. The moment that people start engaging in financial transactions on line, the practicalities of maintaining anonymity will decline substantially. I think it should always be possible to communicate anonymously and completely privately. I think that capability exists in the physical world and it should exist in cyber space.

But I think that my somewhat pessimistic view is that the IRS and the banking community are simply not going to sit by while perfectly anonymous financial transactions are allowed to occur.

(Laughter)

It's just not going to happen. People talk about concepts of digital cash, which I think are wonderful, but I think there will be real limitations. We might get to a point of confidentiality in that sort of financial transaction, but I cannot ever imagine widespread anonymous financial relationships.

I think this has a direct implication for issues about pornography, because my sense is that while the Jake Baker case is not a case that involved a commercial transaction, the sale of that material, the bulk of sexually explicit material today is in the commercial realm. To be in the commercial realm and be anonymous, I think, is going to be very, very hard for either a buyer or a seller.

PROFESSOR LOWENSTEIN: Virginia Rezmierski, you had a comment about this anonymity question?

VIRGINIA REZMIERSKI:

I agree that we're in an environment where this kind of decision about whether anonymity should be allowed is a difficult one. Where you're working in a community that has to have accountability at some level for basic trust to be formed between individuals and for transactions to go on, that would argue, from my way of thinking, against anonymity.

And I would even say that there are arguments beyond that against anonymity as a technological capability -- that is, if we, in fact, are working towards being a community of reasoned discourse, where we're able to speak up and be accountable for our thoughts and be willing to stand toe to toe with people who have different thoughts in order to work these out.

We don't want, in my view, to encourage the use of anonymity, but instead a higher level of discourse, which is being accountable for what you're saying and what you're thinking, and being able to reason that out with other people.

Now, that's in a world of equality, and I realize that we don't have that in all spheres, but I'm arguing that we need to be trying to build, at least at a university, towards that goal, rather than encouraging the ability to do acts without accountability.

So I still am leaning on the side of not having anonymity. And I think there's a reason for it.

PROFESSOR LOWENSTEIN:

Before I let you go, Virginia Rezmierski, there's another part, I think, to the questioner's question, which was: should there be greater burdens on the carrier? Now you are here representing a very big carrier, and what do you think about that?

Should there be, with all this accountability, should there be more accountability on the part of an entity like the University?

VIRGINIA REZMIERSKI:

Now, I have to speak from my own personal view on this; there isn't a University position on this. I have written that the minute we start to move towards the carrier having responsibility for what's going across the Net, we do two things. First, we move towards disempowering individuals from taking responsibility for their acts -- and as you can tell, I come at this from a desire to empower, and to maintain the empowerment of, individuals, so that argues against it. And secondly, the minute the carrier has to carry that responsibility, they have no alternative but to move towards monitoring. And when you move towards monitoring, you are, in my view, on a very fast slope towards the destruction of liberties.

(Applause)

PROFESSOR LOWENSTEIN: Scott Charney.

SCOTT CHARNEY:

There's one more thing you have to think about in this anonymity debate, which is indicative of how complex these issues are. Every time we talk about anonymity, we talk about the right of privacy of the sender; however, a fundamental precept of privacy is the right to be left alone.

We have been dealing with many cases where people generate 8,000,000 E-mail messages per minute and so flood someone's mailbox that they are forced off the Net. So the question is, what happened to their right of privacy?

PROFESSOR LOWENSTEIN: Did you have something you wanted to say, Barry?

BARRY STEINHARDT:

Yes. I wanted to say something about the carrier issue. You know, as a practical matter, it isn't possible for carriers, conduits like the University of Michigan, to monitor the content of what goes over their networks. I mean, think about it for a moment. There are millions of postings to the Internet; it's simply not possible for the University of Michigan or America Online, or even most small bulletin board providers, to monitor the content.

The consequence of making carriers liable is that you're going to force those carriers to greatly restrict access to the kinds of sources of information that they are willing to carry. So, for example, we've had lots of cases of universities across the country concerned about their liability, the most prominent of which was Carnegie Mellon, which cut off access to the alt.sex newsgroups.

Well, whether or not you have the same view as Professor MacKinnon about the question of sexually explicit speech and pornography, nevertheless the alt.sex groups, for example, include alt.safe.sex, which is a discussion about safe sexual practices.

It's that kind of very broad brush that's going to have to be used to restrict speech if you're going to start making the University of Michigan and other Internet and interactive media providers liable, because they're simply not going to be able to monitor. The only alternative they will have is to cut off access to anything that appears to be dangerous.

PROFESSOR LOWENSTEIN: Catharine MacKinnon, you have a comment?

PROFESSOR MACKINNON:

Yes. I would just like to take this moment, at the repeated invocation of my views on pornography to correct the misrepresentation of them. It's interesting that one can have a discussion about cyber space without people lying about other people's views. However, when one talks about pornography, it does seem unavoidable.

My position is not that speech which puts women in positions of display should be criminally prosecuted. My position is, first of all, that there should be civil remedies, resonating in sex equality, when harm through pornography can be proven. That's civil as opposed to criminal; that's one misrepresentation of my view, a significant one.

And the other is that I use a definition of pornography as sexually explicit materials that subordinate women through pictures or words that also -- the word that was missed here was 'also' -- include one or more of a list of the following. On which list, after you have coercion, force, assault, defamation, or trafficking in subordination, you then have, in addition, as one option, women presented in postures of submission, servility, or display.

Now, you will note that that's very different from what my position was represented as being. While that seems not to be able to be made clear in this country, it is very clear in Bosnia, particularly among my clients who have been raped and had pornography made of their rapes, and who have retained me to attempt to do something about it internationally.


PROFESSOR LOWENSTEIN: Question, here.

AUDIENCE PARTICIPANT:

This is a question for Mr. Steinhardt, but perhaps Ms. MacKinnon would also like to comment after Mr. Steinhardt. The Nation magazine, a strong advocate of equalizing power between the sexes and a strong critic of Catharine MacKinnon's way of doing it, says we need to organize impassioned public censure -- C-E-N-S-U-R-E -- not state censorship, and when I refer to that, I include civil remedies in state courts that are as fierce or fiercer than the criminal remedies. Doesn't the Internet create an exciting new opportunity to use organizing to empower women and to protect all oppressed by violence, not only the violence of rape, but police violence, state violence, and work violence? And isn't state interference with pornography being used to coerce acceptance of censorship of one form or another, whose end purpose is to protect the military industrial complex and thwart mass education and organized democracy?

PROFESSOR LOWENSTEIN: Barry Steinhardt, do you want to try to answer that?

BARRY STEINHARDT:

I will try to answer that without making reference to the military industrial complex.

PROFESSOR LOWENSTEIN: Maybe you can summarize what the question was?

BARRY STEINHARDT:

No, I'm not going to attempt to do that, I'm just going to try to answer it, as I prefer to understand it.

(Laughter)

One of the things that we didn't talk about, I don't know if it's unique, but certainly distinctive about interactive communications, is that interactive communications give you the ability to respond to speech that you find distasteful or offensive or whatever, instantly, and to do it reaching exactly the same audience, through exactly the same medium.

For example, if you're a follower of a news group about the Holocaust -- this is a real example some of you may know -- and you read that someone out there is denying that the Holocaust, the mass slaughter of 6,000,000 Jews, ever took place, you have the capacity, by hitting that reply button, to respond instantly and to reach the very same people, and it costs you nothing more than whatever your minute-by-minute charge is from your particular provider, whether you're on one of the commercial providers or on the Internet.

That is very different. You read something you don't like in the New York Times, or the Detroit Free Press, or whatever -- what's your remedy? Well, you can try to write a letter to the editor. But there are lots of letters to the editor. Your chances of being published are pretty slim, and certainly you're not going to get the same play as a front page story got.

It really is very different out there, and it does empower people to respond to what they regard as bad speech with good speech, and I think we have to keep that feature in mind.

PROFESSOR LOWENSTEIN: Do you want to comment on that, Catharine MacKinnon?

PROFESSOR MACKINNON:

Your plug for the Nation made me feel really good about it. Better than I have in some time.

(Laughter)

I think it's clear, from what I said, that I think that the Internet, as such, does provide a marvelous possibility for communication, organizing, and speech among people who have had fewer resources than international corporations, whose speech is the speech that has been protected mainly to date. And, yeah, I think it's a great thing.

The repeated use of the word 'interactive' -- I would just like to add something about that, and that is there's another thing going on, and that's the development of interactive pornography, which I think does begin to raise some slightly different issues than we've seen in pornography to date. That is, interactive pornography in which the consumer, I think appropriately called the user here, can feed back into the computer what he wants to see and have the woman on the screen perform, or be seen to perform, a whole series of acts and so on, in interaction with him and his demands. In other words, this moves it increasingly yet one more step closer to the real enactment of it on a person. It makes it more real. I'm not saying the other isn't real, I'm just saying that that level of interaction with it poses, I think, the possibility, as the technology improves -- you know, I don't know why everyone else doesn't think this is the most interesting issue of the relationship between art and life that's ever been conceived here. It is, you know, it is elaborate. It is immediate.

And the relationship between these materials and the living out of real life, I think, has the possibility of taking yet another step, you know, in the direction of art being life even more closely than it has before, through this use of the word interactive in this medium.

DANIEL WEITZNER:

I just want to say your question prompted this thought that I think we're all in, you know, resounding agreement that this technology has the potential to change some of the significant power relationships that have shaped media, that have shaped expression. I think we all agree on that.

I do think that, in a sense, in looking at pornography as Professor MacKinnon would have us look at it -- where words act -- there's something that is quite timely about that notion, really, in connection with a number of other issues that arise on the Net.

It is timely because not only are we, as Barry said, in a communications revolution; probably more importantly, we're in an information revolution. And I always hate that term, but there is, I think, a lot of truth in the notion that our economy is increasingly information based. Well, information is words, is speech.

The problem that we are facing in a number of areas, including the encryption arena that people have brought up, is that speech is also becoming property. What some people would see as discourse and dialogue involving the words and information of others can also be construed as theft and misappropriation of that property.

So we have, I think, blurring lines here between speech, property and action that I really think we have to start sorting out.

It is not simply enough to say that everything that goes on the Internet is discourse and it is by definition protected by the First Amendment.

This may be sacrilege, I'm a little uncomfortable about saying it, but I think we have to face it. I really think we do.

That if we're in an information economy, if the terms that somehow bind and mediate our society are increasingly information, that can't all just be absolutely protected under the First Amendment. We're going to have to start sorting out these lines.

I happen to think that pornography may be a particularly bad case to try to sort out a lot of those lines.

(Laughter)

DANIEL WEITZNER:

But I think it's an instructive case. I do.

PROFESSOR MACKINNON:

It's a particularly bad case because of what it makes you need to think, right?

DANIEL WEITZNER:

Well, I'm not sure why it's a bad case.

PROFESSOR MACKINNON:

In other words, I'm suggesting it makes you need to think that something has to be done about it if you go down that road, or does it not?

DANIEL WEITZNER:

No, no, no. It's because I'm not sure that what has to be done about pornography is the same thing that has to be done about speech as property or anything else.

PROFESSOR MACKINNON:

I agree with that. I would agree.

DANIEL WEITZNER:

I don't think that the solutions are the same, and I think if we try to have First Amendment rules that are shaped by a response to pornography, they're not going to be the same kind of First Amendment rules that are shaped by political discourse or by exchange of property.

I think we've got to sort all these things out.

My concern simply is that we seem to be sorting them out first, driven by pornography, and I don't know that that's -- which is not to say we shouldn't sort out that -- but we shouldn't extend that to these other areas.

PROFESSOR MACKINNON:

I completely agree that an analysis of pornography and the way it is approached is and needs to be kept very specific to it. It is a very specific kind of thing.

What I would just like to inject in this, though, that is so amazing is: what is it about this cyber space that has permitted this discussion? That is to say, the discussion about pornography, about how it can be that words do things. Why is it possible that all of a sudden people are seeing that here? You know, why do we need this screen around this thing for people to come to see this thing that is going on in what is called RL? Why is it more real in VR than in RL -- put it that way -- you know, suddenly?

There's an amazing article that I'd like to refer you all to by Julian Dibbell, which was in the "Village Voice," December 21, 1993, called "A Rape in Cyberspace," and I made a brief reference to what happened there.

He says towards the end of this:

The commands that you type into a computer are a kind of speech that doesn't so much communicate as, in italics, make things happen, directly and ineluctably, the same way pulling a trigger does. I can no longer convince myself -- this is Mr. Dibbell -- that our wishful insulation of language from the realm of action has ever been anything but a valuable kludge, a philosophically damaged stopgap against oppression that would just have to do till something truer and more elegant came along.

The end of quote from him.

It is thinking about cyber space that led him to see that words do real things.

Of course, the next necessary question is, you know: if they do that in cyber space, why don't they do that in the rest of the world?

PROFESSOR LOWENSTEIN: Virginia Rezmierski, you had a comment.

VIRGINIA REZMIERSKI:

Quickly.

There are two concepts that are really important, it seems to me, that keep coming back to this and that certainly show up in so many of the incidents we're dealing with. The first is the issue of intrusiveness, and certainly the technology is intrusive. It intrudes on my desk every morning with a hundred E-mail messages, and again every minute later as I have to use it to do my scheduling over Meeting Maker, or whatever. So it has intrusive components in and of itself. But what people can do with the technology is also intrusive. So one concept is intrusiveness.

The other is the concept, and the importance, of empowerment. I think that when we start talking about electronic access to pornography, we mix some things.

We mix the issue of access and our rights to access or not access material.

The position of the university has been that we will not censor -- that the university community has the right to access anything it wishes to access for its purposes.

Now, put access aside for a second, though that's not altogether an easy thing to do in these circumstances.

The second piece of this that always gets mixed in with when we talk about pornography is what people get to do with the stuff they have accessed. And these are the kinds of incidents that we are experiencing.

An individual will access pornography, or something that someone else may find to be offensive, and then send it to other people's electronic mail or dump it into their class account, so that people are unwittingly exposed without making any decision that they wish to access that material. Their right to make an access decision is taken away by the person who exercised their right to access it.

The material is dumped on a common printer, so when you go to pick up your material, you have to be exposed to the material that someone else accessed, while they sit back and watch the effect of their intrusive act on your behavior.

So there is a big issue that we need to deal with at a university, it seems to me, and that is how to empower individuals to make determinations about their boundaries and to be able to say: you don't have the right to exercise your rights on one side and take away my rights on another.

I think it's important for us to keep these two issues separate. Access is a really important one when it comes to censorship or non-censorship. But equally important is the issue of my rights to make a decision about my work and personal space. And that's an empowerment, an issue of limiting the intrusiveness of both the technology and other individuals on me.

(Applause)


PROFESSOR LOWENSTEIN: Question over here.

AUDIENCE PARTICIPANT:

I have a question about (inaudible) whether people have a right to be anonymous on the Internet.

It seems that there's an opposite thing going on, which is it's very easy to impersonate somebody else.

I have a scenario here which I just wondered what you all thought of, and it kind of deals with pornography issues.

If someone writes a program that is trained to read alt.sex and learns how to write these stories, I was wondering, first of all, who is liable? The author of the program or the author of the stories that the program has been learning from?

My second question is, how do you -- I mean, then we'd be censoring a program, and in essence that program has taken the form of a person on the Internet, and how are those issues going to be dealt with? I mean, the program as an actual entity which people on the Internet will see as a person -- does that have rights, or how would you send (inaudible) (coughing)

DANIEL WEITZNER:

I really have to say, I think programs don't learn. Programs can be made to function in a way that mimics what we think of as learning. I think that people write programs to do certain things.

Now, you may have, you know, standard of care issues about how careful you have to be when you write a program and the degree to which you can foresee the consequences of the operation of that program, but I really don't ever want to get to the point where we have to start talking about the rights of computer programs.

(Laughter)

DANIEL WEITZNER:

I really don't. We can't talk about rights anymore, if that's what we're talking about.

PROFESSOR LOWENSTEIN: Barry Steinhardt, you had a comment.

BARRY STEINHARDT:

As some of you might know, we have a 'Rights of' series. I was wondering -- the rights of computer programs.

What I actually wanted to do, what your question sort of reminded me of was the earlier question about encryption technology, specifically about public key encryption, about the Pretty Good Privacy program that Phil Zimmermann is the author of.

There's a flip side to this encryption issue, which is that not only is encryption used to keep information private, but public key encryption is also used to verify authenticity. And the suppression of public key encryption is an issue in commerce, where we want to be able to verify that this was, in fact, someone -- you know, that this was in fact Joe Jones' credit card and Joe Jones' authorization to buy whatever.

So there's a real risk that we run by suppressing encryption technologies: that we're going to deny American industry and others the ability to verify the authenticity of communications.
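[The authenticity use of public-key cryptography that Steinhardt describes is what is now commonly called a digital signature: the sender signs a message with a private key, and anyone holding the matching public key can check that the message really came from the key holder and was not altered. What follows is only a minimal sketch of that idea, assuming the modern third-party Python "cryptography" package, which postdates this 1995 panel; the sign_order/verify_order helpers and the Joe Jones order are purely illustrative, not anything discussed by the panelists.]

```python
# Illustrative sketch only: verifying that an order really came from its signer,
# using the third-party "cryptography" package (pip install cryptography).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# The buyer generates a key pair once; the public key can be shared widely.
private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()

def sign_order(order: bytes) -> bytes:
    # Only the holder of the private key can produce this signature.
    return private_key.sign(order)

def verify_order(order: bytes, signature: bytes) -> bool:
    # Anyone with the public key can check the signature; verify() raises
    # InvalidSignature if the order was forged or tampered with.
    try:
        public_key.verify(signature, order)
        return True
    except InvalidSignature:
        return False

order = b"Authorize a $20 charge to Joe Jones's card"
signature = sign_order(order)
print(verify_order(order, signature))                        # True: authentic
print(verify_order(b"Authorize a $2,000 charge", signature))  # False: altered
```

[Suppressing strong cryptography would suppress this verification capability along with the confidentiality one, which is the risk Steinhardt points to.]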


PROFESSOR LOWENSTEIN: The woman in red.

AUDIENCE PARTICIPANT: . . .

But what I really wanted to know about: you've got people like Scott Charney, and I know that the police have people being trained in computer theft and things like that, because most (inaudible) people's telephones, maybe a place of names and addresses, because there's (inaudible)

You've got politicians deciding on the law, just like the S314 . . . [but they] don't know what it is they're deciding.

Is there any way that we can require that people know what they're doing?

(Laughter)

(Applause)

PROFESSOR LOWENSTEIN: Danny Weitzner, that's sort of your job, isn't it?

(Laughter)

DANIEL WEITZNER:

Right. I can tell you what we're doing specifically in response to Senator Exon's bill, which, I should say, not only bans the creation or transmission of obscenity and indecency, but also lewdness, lasciviousness, filthiness and annoyance.

(Laughter)

DANIEL WEITZNER:

That's also my job, you know.

What we are trying to do, with the help of some of the companies that produce this technology, is really just to set up demonstrations so that we can bring a little laptop into Senator Exon's office and say, "Here, see, it's not quite what you thought it was."

I do think, though, that there is a broader experiential gap, which is that just seeing it isn't having the experience of using it. The only thing I can say -- I mean, I think this is a broader problem with the Congress and the political establishment. As we know, George Bush didn't know what a supermarket scanner was. That probably had more serious implications than not knowing what an E-mail address is.

So there is a broader problem. It's safe to say that there will probably be some unthinking laws made. But I think on the encouraging side, the people who use this technology really are getting better and better at making their voices heard.

There are about 80,000 people who signed a petition on the Internet against the Indecency Bill. We have been working with the ACLU and some other groups to encourage people on the Internet to learn more about Exon's bill and to contact their senators, and that is happening at a substantial rate.

So, you know, you have to do this piece by piece, person by person.


PROFESSOR LOWENSTEIN: More questions? Let's see, we already got to you. Over here.

JOSH WHITE:

Hi, my name's Josh White of the Michigan Daily. This is for Professor MacKinnon.

You mentioned civil remedies as the way we take care of the attacks on women on the Internet. Mr. Steinhardt mentioned the possibility of seeing how civil cases could be brought out of this, and Professor Lowenstein mentioned in your introduction that you were representing the woman who was named in the Jake Baker stories.

I was wondering if we could expect a civil case to come out of what has happened to this woman?

PROFESSOR MACKINNON:

Currently, this woman is being represented by the United States in a criminal case for threat. She's being represented by the District Attorney, who is her lawyer.

There is currently no thought being given to any civil action on her behalf.


PROFESSOR LOWENSTEIN: Question over here, in the tie.

(Laughter)

PROFESSOR LOWENSTEIN: That makes you unique here.

AUDIENCE PARTICIPANT:

I'm Kirk Gavin from the Michigan Computer Crimes Task Force, and I've trained cyber cops with the federal government since (inaudible), as a matter of fact.

One of the issues we always have is the explosion of the traditional legal jurisdictional boundaries. We have fifty states with fifty different laws and they interact.

We have talked recently about the California couple that just drew two-and-a-half-year prison terms in Tennessee for operating a bulletin board in California from which pornography was downloaded into Tennessee, and they are now in a Tennessee prison. I'm wondering, is there going to be a federal preemption of the state jurisdictional laws concerning computer fraud and concerning pornography? Because, in reality, Mr. Baker has exposed himself in every single state to possible obscenity violations. There could be prosecutions.

So does anyone have any thoughts on that? I'd like to tell these cyber cops that we could --

PROFESSOR LOWENSTEIN: Let's start with Barry Steinhardt.

BARRY STEINHARDT:

Well, I think we're going to have to completely re-examine notions of jurisdiction here.

I mean, if you take the Amateur Action case, which is the case you referred to in Tennessee, there was a bulletin board in California and the bulletin board was accessed from Tennessee by, I think it was, a Postal inspector. The crime is --

DANIEL WEITZNER:

In his official capacity.

(Laughter)

BARRY STEINHARDT:

Right. Well, he says in his official capacity.

The crime is interstate transmission of obscene material.

Well, it raises some very interesting questions. Not only does it call into question the Miller test for obscenity -- which community's values do you apply here? -- but it also calls into question whether it was even transmitted, or whether it really wasn't the equivalent of his going to California, in this case traveling through cyber space rather than physical space, getting the material, and bringing it back to Tennessee. Those are all very complicated questions.

You know, we've been following the state analogues to the Exon bill in various states.

One of them was in Oklahoma. In Oklahoma there's a bill in the legislature that would require that all bulletin boards -- which is not a term that they defined, but it apparently would seem to sweep in everything from Internet providers to CompuServe to these little private bulletin boards -- must display a notice that it is illegal in Oklahoma to transmit obscenity or pornography, which, as to the latter part, is of course not even true.

But I don't know how it is that international providers could ever -- let's assume that Oklahoma passes that bill and that forty-nine other legislatures pass bills with different requirements -- how could anyone ever possibly be expected to know about or follow all of them?

We're getting into very complicated and uncharted waters here, and there probably eventually will have to be some federal preemption because of that.

PROFESSOR LOWENSTEIN: Professor MacKinnon.

PROFESSOR MACKINNON:

Just as to your remark about obscenity law.

Obscenity law, as I'm sure most of you know, is indifferent to whether an individual is named or used or not. In other words, it doesn't make it more obscene that someone is named.

It's also entirely indifferent to violence. That is no part of the obscenity test.

I'm interested in your assumption that the Internet pornography -- that is, pornography in which someone is shown being killed, in words -- would be obscene in fifty states.

I suppose maybe the question is, if it wouldn't be, what would be?

But I think the conclusion is equally available that very little is.

The other thing that is really worth noting here is that even those very few things which have ever been found to be obscene, are available on the Internet now.

In addition, wholly apart from obscenity, we do have laws against child pornography in this country that flow from the use of children to make it.

Now, presumably, although no one's ever mentioned this, that might make a difference -- that might make a distinction between the use of a real child's name.

In addition, although child pornography is criminal in this country in a way that there has been actually some serious enforcement -- unlike obscenity -- with some consequences, it also is available on the Internet.

So that it's just important to keep in mind that as to this business of pornography, you've got obscenity laws. Even those few things which are found to be obscene are very much available now on the Internet. You can't get it over in Ypsilanti. You can get it on the Net.

The same with child pornography. You can't just easily walk in and get it, although any man who wants to get it can -- anyone who wants to get it can -- but you can easily get it on the Internet.

DANIEL WEITZNER:

I just want to, on the one hand, agree with Barry that the whole community standards doctrine is really, I think, faced with a critical challenge: the whole purpose there, of enabling local communities to have some local control while not creating a national standard, is clearly under challenge here.

But I actually think I want to disagree with the notion that there ought to be preemption, especially to the extent that we're talking about criminal charges.

One of the things that I found disturbing about the Baker case is that the FBI was involved at all. As far as I can tell, to the extent that this was a crime, this was a local crime.

It may mean that some states have to go back and look at their harassment statutes and look at their threat statutes, but I really believe very strongly that, to the greatest extent possible, criminal law ought to be a state matter. The FBI is totally unaccountable to local communities. I think clearly local police need to learn more about this technology so they can behave responsibly.

But I am absolutely opposed to the notion that we should federalize all of the crimes that occur on the Internet simply because they go across state lines.

The original statutes regarding harassment with telephones were put into place in 1968, at a point when people were using long distance telephone lines quite a bit more and there was a perceived enforcement problem and there were some evidentiary problems.

I don't see any of those kind of problems in the Baker case or in a number of imaginable harassment incidents where they're local crimes. I think they should remain that way.

With respect to Scott, I just don't think, you know, the federal government should be involved unless there's a clear reason.

PROFESSOR LOWENSTEIN:

Let me ask Scott Charney.

Do you want all those cases?

SCOTT CHARNEY:

I don't see preemption for a host of reasons.

First of all, there are sovereign issues with the states that have to be considered.

Secondly, the federal government doesn't necessarily have the resources to prosecute all these crimes.

Third, you know, there are cases that have interstate implications, and, of course, the federal government is better suited to deal with some of those cases.

As for community standards, one of the things I think we also have to remember is that, with the Internet being global, are we going to use the lowest worldwide common denominator as the test? I don't know.


AUDIENCE PARTICIPANT:

There were just some points in the FBI affidavits for a search warrant -- where they went into some of Mr. Baker's discs -- with the assertion of a case of a gentleman in Yorba Linda, California, who became enamored with Mr. Baker's communications, perhaps his distortion, and who, according to the affidavit, sent a video tape of a seventh grade girl and requested that Mr. Baker write a fantasy for him about that girl, which he then did, according to the affidavit.

So you do get some interesting aspects as well, involving children and other things (inaudible)?


PROFESSOR LOWENSTEIN: Question over here?

AUDIENCE PARTICIPANT:

Yes. I would like to address Ms. Rezmierski.

One of the points that you raised, that you and I have agreed on in the past -- as a matter of fact, the very first point this evening -- was "to prevent the reactionary institutional control," particularly emphasizing "preventing the reactionary."

I'm a correspondent for radio station, RFPI in Costa Rica. We have an Internet address of graymatt@cyberspace.org.

I wondered what the . . . why the University reacted as reactionarily as it did in the Jake Baker case, when your efforts and mine seem to show that there have been true facts of confirmed stalking, confirmable staff promotion 'fixes', confirmable bomb threats, and confirmable gratuitous e-mail openings penetrating into the Office of Management and Budget. That's an unfortunate part of history here.

Why did the University choose sex and violence in a virtual place instead of the reality of what has been delivered through your office to the University? To do something reactionary and to keep somebody in jail without bail?

VIRGINIA REZMIERSKI:

When we receive a complaint of any type, as you well know, we investigate that and determine what action is required. We did the same in this case.


PROFESSOR LOWENSTEIN: Question right there?

AUDIENCE PARTICIPANT:

Yeah. Professor MacKinnon has stated that we now (inaudible) communication that's like any other communication, speech, and in a sense that it should be protected unless there is a show of harm. And there we have a show of harm -- and what do you interpret as a show of harm?

They make efforts to show this real threat and the (inaudible) of what may be a real threat and that he threatened to carry out a place to be, et cetera, et cetera, to show this.

But then, you admit that, no, it's not just this issue of whether it is a real threat, but that you would be opposed to (inaudible) the display or expression of anything that shows women in this submissive position. And again, what is this interpretation, your interpretation, of a submissive position, for example -- which may be quite consensual, perhaps?

So I think you know (inaudible) try to show that, no, don't the issue -- the issue of is it a real threat, the details of the Baker case (inaudible).

But then it turns out it's really a broader issue. It's broader even -- Professor MacKinnon, you were actually, as I understand it, instrumental last year in suppressing a student exhibition, and this particular entry showed and depicted various aspects of the daily life, and the exploitation, in the life of a prostitute, in which (inaudible). I think the point is, the reason why it's not so much an issue of what the details of this case are, and what the outcome is, is because I think that this (inaudible) that you were lucky enough to find someone who used a real name, who talked about making plans, who was foolish enough, in my opinion, to give his actual -- I mean, to give up his privacy rights and hand it over to the FBI.

So I think that this stance shows, at very best, a naive faith in the government and a willingness to hand over to the government, the state, the right to judge which speech should be suppressed and which speech shouldn't. As though this were some (inaudible) state. Now, this is the state which carried out the McCarthy witch hunts, as you well know, FBI assassinations of civil rights and other activists, assassinations and invasions overseas, and it's right now, today, as well, moving to judicial code --

PROFESSOR LOWENSTEIN: I want to ask you if you can narrow this down, if you have a question for somebody.

AUDIENCE MEMBER:

I was first going to ask, would your position really be different if there were no actual name mentioned, no supposed real plan of abduction? But you yourself answered that, so I will ask: what better hope is there for getting us (inaudible) and supporting the policy of suppression of speech, when things begin with such a clearly compulsive (inaudible)?

PROFESSOR MACKINNON:

I appreciate the chance to repeat myself and straighten some of this out, although I must say I'm not real optimistic about doing it. I said a showing of harm; that means that harm needs to be shown. I didn't say a show of harm -- a showing of harm. The structure of what I said began by talking about, yes, indeed, this is a threat. It is.

I then moved into saying other things that it also is. Now, if it's possible to hold two thoughts in mind at the same time, they would be, first, that it is a threat, and second, that there are things in addition to that that it is, a lot of which involve the pornography. That doesn't mean the pornography is not a threat; it simply means that the E-mail communications make very concrete the threat which is contained in the pornography, and the other aspects of the pornography then go beyond that threat.

Now, I was not instrumental in suppressing an exhibition. That isn't what happened. A complaint that there was pornography under the sponsorship of a student group here was made to me. I communicated the complaint to the student group, and the student asked me, 'How shall I get back to you on this?'

I stated, 'You are not accountable to me on this.' That student, together with others, went off and made a decision -- the only other thing, by the way, that I said to her was, 'Did anybody look at this before you put it up under your sponsorship?' She said, 'No, not that I know of.'

So the other thing I did suggest to her was that someone should look at it. That's what I said. Now, I think that's generally considered speech, you know, when people look at things and consume them and so on.

I do find it fascinating the way pro-pimp forces characterize reality as a lucky break for people who seek to do something about pornography. Unfortunately, every day there is yet another lucky break for us. That is to say, yet another woman being violated in very real ways through this stuff.

It's not a lucky break. It's reality and it keeps happening. And it keeps finding the people who haven't been able to be shut up on talking about it. How about that?

Now, the only lucky thing, I think, about this, aside from the fact that people are taking it seriously, which I don't know what to attribute that to, but I guess luck is as good as any, is that we do know so much about this man. That, indeed, does make this somewhat exceptional. Not entirely so, but somewhat so. And knowing as much as we do about him, by his own permission, has made it possible to fill out the picture that usually we are left imagining. In other words, usually we say, "Boy, I wish we knew more about this guy," what he really thinks, et cetera.

People relate to cyber space as putting in a lot of things that we have always known, or thought about, imagined, planned. If there's one, there's two, three, four, probably, but we don't know it. If he's thinking about it, maybe he's talking about doing it. But we don't always know that.

So, you know, the lucky break here, really, I would say, to the extent I would agree with you about that is that we know as much as we do about this piece of real life.


PROFESSOR LOWENSTEIN: I think we owe at least one question to somebody from the Law Review. Let's make this the last question.

AUDIENCE PARTICIPANT: I'm not a member of the Law Review.

PROFESSOR LOWENSTEIN: Sorry. You were sitting in their place. I'll give you the question anyway.

AUDIENCE PARTICIPANT:

I'm Vince Keenan representing the Michigan Student Assembly Students Right Commission.

We were talking earlier about the possibility of taking out, say, alt.sex.stories, and actually Mr. Steinhardt said that alt.safe.sex falls under the category of the alt.sex groups.

I'll say, as somebody working on the Students Rights Commission, when this story hit, boy, I became a regular of alt.sex.stories and catalogued all the responses of all the people on alt.sex.stories about Jake Baker's response -- about Jake Baker.

I don't know how often that's going to happen, that one of these groups is going to become sort of the hot bed for civil liberties discussion, but I would say as someone who is very concerned about happenings in the community, it was an invaluable resource.

Another thing that happened in terms of that particular group was that the group I represented worked very, very hard to try and keep -- we wrote hundreds of E-mail responses trying to keep the original story from being posted again.

We had contact with people that we would normally not want to have contact with, but they said, "We have the story, should we re-post it?" And we said for a number of very legitimate reasons, that not the least of which is the security and safety of the individual in question, "Don't re-post this."

And we actually did, to a certain extent, keep it from being re-posted, I believe, for a week and a half or so, until after they arrested Jake.

So I'm concerned about the idea of censoring, or the possibility of even taking these groups out of the University community.

Let me bring this to a question real quickly.

(Laughter)

AUDIENCE PARTICIPANT:

Just in general, it did bring up one of the thoughts in my mind about how pornography and obscenity -- I mean, completely independent of -- I mean, if I can ask this question independent of the sort of moral and legal implications of it.

It certainly has provided the drive for the development of some of this technology. In the weirdest way, we have (inaudible), you know, everyone has VHS and video tape, nobody has Betamax or laser disc, because of the licensing agreements that allowed x-rated movies to be done on VHS. I don't think (inaudible) can say that VHS can't be a learning medium.

And my question is, maybe someone could comment: how important are pornography and obscenity, you know, on the fringes -- these people who have stayed up all night talking about, you know, this stuff and writing about it, writing programs for it -- how important is it for the development and the future development of the Internet?

DANIEL WEITZNER:

Do you have a view? I mean, I'm curious.

AUDIENCE PARTICIPANT:

I guess -- I mean --

(Laughter)

DANIEL WEITZNER:

I didn't mean to put you on the spot. I mean, I'm curious. You obviously have thought about this.

AUDIENCE PARTICIPANT:

It's changed since Jake Baker has started.

I mean, like, I'm amazed at how much stuff of this nature is out there. You know, I mean, like the percentages. I mean, like, it seems to me that it's so big right now, but if you took out that chunk of it, you know, would it continue to expand in the way it has been useful?

I mean, once you can upload a pornographic picture, you can also upload the Mona Lisa.

So I'm concerned. I mean, like I don't know what the answer is.

PROFESSOR LOWENSTEIN:

Well, maybe we can ask Danny Weitzner. Is this concern with obscenity and pornography a big part of what your organization is concerned about?

Are we making a mountain out of a mole hill, or is it really something that is generating a lot of both the questions and the development of the whole technology?

DANIEL WEITZNER:

Well, I don't think this is a mountain out of a mole hill.

I do think that probably the most important dimension of this technology is not the technology itself, but the social and cultural adaptations to it and with it.

I think this is not just a concern of my organization; it should be a concern of all of us that we somehow use this medium responsibly. I don't believe that law has all, or probably even most, of the answers to how to do that responsibly. This is not just about whether you put sexually explicit materials on-line or not. It's about how people behave and what sort of social interactions develop.

I detected in your question a suggestion that somehow it's really pornography that's driving the advancement of this technology. And that I really don't think is true.

People have said that about the VHS market. I don't know enough to know whether that's true or not.

The way this technology developed may even be worse. This technology, in fact, was developed for the Defense Department so that they could have secure networks that would survive nuclear explosions.

So, you know, put that into the mix here, and I don't know, you know, it's Nazi doctors.

But, you know, here we are with it, and I don't think there's any issue of going back.

I think that to the extent that I have a major concern about the interaction between this issue and the development of this technology, it's really the issues.

(Lights turned off)

(Laughter)

PROFESSOR LOWENSTEIN: Uh, oh.

DANIEL WEITZNER: How am I supposed to take that?

(Laughter)

DANIEL WEITZNER:

Usually they flick on and off. It's a little more subtle.

It's really the issues that Barry raised about carrier liability. If the University of Michigan and America Online get held responsible for policing all this, then we're in real trouble. That is the issue that I would say I have the most worry about here.

PROFESSOR LOWENSTEIN:

Let me just ask Virginia Rezmierski a kind of specific question raised by the student who asked about access here.

Has the University of Michigan considered limiting student access to these kinds of networks, like the alt.sex area on the Internet?

VIRGINIA REZMIERSKI:

Yes. And decided that it was against the values of the University of Michigan to do so.

(Applause)

PROFESSOR LOWENSTEIN: But Barry Steinhardt wanted to talk and then we'll follow up with Professor MacKinnon.

BARRY STEINHARDT:

Two things.

First I want to commend the University of Michigan for that, and particularly, Virginia, for making the distinction between the issues of access and what you do with that information once you attain it.

I think that is an important distinction.

I also wanted to just follow up on something that Danny said and follow up on your question.

I don't know whether sex -- or sexually explicit material; and I would reject the term pornography, because I don't know what it means -- I don't know whether it's driving the development of the technology or not.

My real concern, though, is that the almost obsession that a lot of people have with the regulation of sexually explicit material will result in remedies that are going to severely restrict the content that's available.

I mean, to use your example, if the University of Michigan finds itself in a position where it may be liable for having sexually explicit, indecent, lascivious materials, it may not only have to ban all sex, but it may also have to ban, you know, the World Wide Web page for the Louvre, because there are all kinds of what many people would regard as classic pieces of art, sculpture, paintings, et cetera, which some people find to be lascivious.

That's the road that we're headed down if we're going to begin to hold the carriers liable for content, particularly when we begin to use vague terms like lewd and lascivious.

PROFESSOR LOWENSTEIN: Professor MacKinnon.

PROFESSOR MACKINNON:

Yeah. I mean, just before everybody gets really hasty about their approval of access by students to pornography on the Internet -- all of your fees are going to supporting this.

Suppose you knew that eighty-five percent of the activity around here on the Internet was pornography. Say a really big percentage. And that people were doing with it the kinds of things Jake Baker was. That is, say he's getting paid by the University on work study or whatever, to sit at a computer to do something, and he's consuming alt.sex.stories and writing them.

I just sort of thought I'd inject that possible piece of reality for your consideration. In addition, there was a question that Virginia Rezmierski posed earlier and it was -- there were several of them.

Can you be harassed if you're not a recipient of it? Can writing itself be violent? Can you be targeted by something that is loving if you don't want it? In other words, if it's targeted at you and it is loving, can that be harassing, and is this all an appropriate use of resources?

I would say to this: can you be harassed if you're not a recipient? Yes, you can be, because things are said about you that surround you, that create your environment, that make your world what it is, whether you got it or not. In other words, it's targeted at you even if you didn't get it, and that creates an environment of sexual harassment.

Can writing itself be violence? I say no. It can do harm, however, but that doesn't mean it is violence.

But this part about what it is -- is it an appropriate use of resources? I think that's the question.

Do you all want to be paying for people to do what Jake Baker did?

In addition, should the women's fees at this university go to supporting open access to materials that then go to creating a hostile environment for them -- which potentially interferes with their getting equal access to the benefits of an education?

(Applause)

PROFESSOR MACKINNON:

As to whether pornography should be something to which access is funded at this University, I think that the consequences that has -- putting the University in the position of a trafficker -- for equal access to the benefits of an education for women here should be part of that discussion.

What does it do to target women for rape and sexual abuse? How does it connect with the actual rape rate on this campus? I mean, I think that's the kind of thing that we should be thinking about.

PROFESSOR LOWENSTEIN:

We're going to have to wrap it up there with probably a lot more questions raised than were answered.

I want to just thank the Michigan Telecommunications and Technology Law Review for making it all possible.

(Applause)

MARK LONG:

I just want to encourage people who are interested in these issues to visit our Web page and continue to learn as we have tonight about these issues.

I'd like to thank the panelists for a fascinating discussion, and Professor Lowenstein for filling in at the last moment.

Thank you. Good night.

(Applause)
