backbone transparent proxy / connection hijacking

performance for non-HTTP protocols. Many people believe that
large-scale caching is necessary and inevitable in order to scale
the Internet into the future.

Horse-hockey. Purchasing the proper amount of bandwidth from your upstream
and your upstream implementing the facilities to deliver that bandwidth is
what is necessary to scale the internet into the future. Over-allocation
is bad enough on the dialup-ISP side of the equation without backbone
providers doing the same thing. If your backbone provider doesn't have the
capacity to deliver what they have sold to you, it's time to contact two
entities: your lawyer and another backbone provider.

The Cybercash server performed client authentication based on
the IP address of the TCP connection. Placing a proxy (transparent
or otherwise) in between clients and that server will break
that authentication model. The fix was to simply configure Traffic
Server to pass Cybercash traffic onwards without any attempt to
proxy or cache the content.
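
For what it's worth, that "fix" amounts to punching a hole in the proxy for
every such destination. A rough sketch of the idea in Python (the netblock
is made up for illustration, and this is NOT actual Traffic Server
configuration):

    import ipaddress

    # Hypothetical list of destinations that must NOT be proxied because
    # the origin authenticates clients by source IP address.
    PASS_THROUGH = [ipaddress.ip_network("192.0.2.0/24")]  # e.g. Cybercash

    def may_intercept(dst_ip, dst_port):
        """True if the box may hijack this connection; False if it must
        be passed through untouched."""
        if dst_port != 80:
            return False  # only HTTP is worth caching anyway
        addr = ipaddress.ip_address(dst_ip)
        return not any(addr in net for net in PASS_THROUGH)

    print(may_intercept("192.0.2.5", 80))     # False: hands off
    print(may_intercept("198.51.100.1", 80))  # True: hijacked

Note what this implies: somebody has to know every such netblock in advance
and keep the list current forever.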

With web commerce companies popping up all over, who wants to be the one to
track these companies and their netblocks and reconfigure the proxy every
time one comes online or changes providers? This would be a nightmare for
the dialup provider, let alone their provider or their provider's provider.
If we purchase transit from provider X because provider Y doesn't peer with
provider Z from whom Commerce Provider Alpha gets their connection, who do
I yell at when my customers yell at me? Sounds like a huge finger-pointing
match to me. Like we don't already encounter too many of those?

The second example was of a broken keepalive implementation in
an extremely early Netscape proxy cache. The Netscape proxy
falsely propagated some proxy-keepalive protocol pieces, even
though it was not able to support it. The fix was to configure
Traffic Server to not support keepalive connections from that
client. Afterwards, there were no further problems.
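
And again, the "fix" is a one-off exception wired into the middle of the
network. Here is roughly what "don't support keepalive from that client"
boils down to, sketched in Python (the client address is made up):

    # Hypothetical list of clients whose proxies advertise persistent
    # connections they can't actually handle; close after every response.
    BROKEN_KEEPALIVE_CLIENTS = {"198.51.100.7"}  # e.g. that old Netscape proxy

    def fix_response_headers(client_ip, headers):
        """Rewrite response headers before they go back to the client."""
        headers = dict(headers)
        if client_ip in BROKEN_KEEPALIVE_CLIENTS:
            headers["Connection"] = "close"  # no keepalive for you
            headers.pop("Keep-Alive", None)
        return headers

    print(fix_response_headers("198.51.100.7",
                               {"Keep-Alive": "timeout=15"}))
    # {'Connection': 'close'}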

See above. Backbone providers are here to provide unhindered connection to
the internet for their clients. If someone wants to set up a proxy, it's
their business. Publish the address of the proxy and have at it.
Hijacking your clients', their clients', etc., connections and shoving them
through a proxy cache to save money is unethical.

Transparent proxies have one place. On the CLIENT side of the connection.

These two problems are examples of legacy issues. IP-based
authentication is widely known to be a weak security measure.

IP-based authentication, alone, may be somewhat weaker than we would like.
It is, however, much more secure than having one central point on Backbone Z
that all web-based traffic goes through. IP-based authentication is also a
very nice augmentation for challenge/response and even simple
username/password schemes. It limits the usefulness of authentication
tokens that fall into the hands of those who do not possess the skill sets
necessary to mount blind attacks.
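
A sketch of the layering I mean, in Python (the netblock and credentials
are made up):

    import ipaddress
    from hmac import compare_digest

    # Hypothetical allowlist: the netblocks this customer connects from.
    ALLOWED_NETS = [ipaddress.ip_network("203.0.113.0/24")]
    CREDENTIALS = {"alice": "s3cret"}  # hypothetical credential store

    def authenticate(src_ip, username, password):
        """Both checks must pass. A stolen password alone is useless to an
        attacker who can't also originate, or blind-spoof, traffic from an
        allowed address."""
        addr = ipaddress.ip_address(src_ip)
        if not any(addr in net for net in ALLOWED_NETS):
            return False
        expected = CREDENTIALS.get(username)
        return expected is not None and compare_digest(expected, password)

    print(authenticate("203.0.113.9", "alice", "s3cret"))   # True
    print(authenticate("198.51.100.7", "alice", "s3cret"))  # False

And note that a transparent box in the middle replaces the customer's
source address with its own when it talks to the origin, which is exactly
how the Cybercash model above got broken.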

The Netscape server in question was years old. As time goes
on, there will be a diminishing list of such anomalies to deal with.

This is like saying, "Internal combustion automobiles are outmoded with the
advent of plentiful gerbils to power the vehicle on treadmills." Haven't
we all learned the backward compatibility lesson?

The second issue being discussed is the correctness of cached
content. Posters have suggested mass boycotting of caching by
content providers concerned with the freshness of their content.
Most content providers have no such concerns, frankly. The problem
of dealing with cached content is well understood by publishers
since caching has been in heavy use for years. Every web browser
has a cache in it. AOL has been caching the lion's share of

And we all know that AOL is the #1 choice of the uninitiated, a proving
ground until they graduate from a Content Provider to a SERVICE PROVIDER.

US home web surfers for years. For more information on the ways
in which publishers benefit from caching see our white paper
on the subject of caching dynamic and advertising content:

"Dynamic content heuristics
               In the event a publisher has not incorporated
               cache-control headers into their site, Traffic Server
               uses a number of heuristics to recognize dynamic
               Web pages (for example, pages generated using
               cgi-bin or form submissions). These pages are
               automatically refreshed on every access."

So, if someone is using site exec, etc. in their code, and their
provider/webmaster/mother didn't set up Pragma: no-cache, they're
effectively screwed...erm...cached, right?
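
For the record, here is roughly what a publisher has to bolt onto every
dynamic page just to opt OUT of a cache they never asked for (a
hypothetical CGI-style sketch in Python):

    #!/usr/bin/env python3
    # Stamp the response uncacheable for both HTTP/1.0 and HTTP/1.1 caches.
    import sys

    sys.stdout.write(
        "Content-Type: text/html\r\n"
        "Pragma: no-cache\r\n"         # understood by HTTP/1.0 caches
        "Cache-Control: no-cache\r\n"  # understood by HTTP/1.1 caches
        "Expires: 0\r\n"               # already expired, as a fallback
        "\r\n"
        "<html><body>Generated fresh, every time.</body></html>"
    )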

"Refresh algorithms
               For pages not utilizing cache-control headers,
               Traffic Server ships with a default refresh cycle of
               one hour for all pages. In addition, the cache uses
               sophisticated algorithms to selectively update
               rapidly changing pages at a faster rate. These
               updates are accomplished using an
               "If-Modified-Since" request, which saves bandwidth
               if the page has not changed. "

Fantastic. So, let's say I'm Joe Banner Advertiser. Company X has paid me to
present their banner. They wanted to limit the amount of money they spent,
so they had me code my servers to only display their banner X times per
day, since I bill them on impressions. Backbone provider Z installs one of
your boxes. By default, no matter how many connections on the
limited...erm...client side of the box are initiated to retrieve a "fresh"
banner from our banner-farm, you send them Company X until the cache times
out.
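
Run the numbers on that. With the default one-hour refresh cycle, the
damage to an impression-billed advertiser looks roughly like this (the
figures are made up for illustration):

    # Back-of-the-envelope: impressions delivered to eyeballs behind a
    # one-hour transparent cache vs. impressions the ad server can count.
    requests_per_hour = 600  # hypothetical clients behind provider Z
    hours = 24
    refresh_interval = 1     # Traffic Server's default, in hours

    delivered = requests_per_hour * hours  # what the users actually saw
    billable = hours // refresh_interval   # what my banner-farm saw

    print("delivered:", delivered)  # delivered: 14400
    print("billable: ", billable)   # billable:  24

So much for "only display their banner X times per day."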

"Users see improved performance because static
               graphics and applets are delivered from nearby
               caches, insulating them from congestion in the
               Internet cloud."

Let me rebut this quote with yet another quote from you:
"Every web browser has a cache in it."
I'll bet on the speed of the slowest LOCAL cache against one at the
provider any day.

And finally, there has been confusion concerning the
confidentiality and legal issues of transparent caching.
Transparent caching does not present any new threat to the
confidentiality of data or usage patterns. All of these issues
are already present in abundance in the absence of caching.

Oh really? Let's say that I'm Joe Internet Terrorist. (Hey, they thought
that the mailman was harmless too!) I work at backbone provider Z. I want
to create panic on a global scale. I insert a machine on the business end
of your proxy/cache that claims to be some corporate giant who has 80%
share of the desktop market. Now, any requests that your cache makes are
served up by my "local" box and not the REAL webserver. Just to make
things interesting, I tell the webserver to serve up ONE page no matter
what the request is for. On that page, we put something to the effect of
"We have declared bankruptcy. We will no longer support any product we
have sold you. You're screwed."

You're saying "It would be just as easy and probably more likely that
someone would put that box at Company X instead of at the proxy/cache."

Not if they also wanted to spoof the webservers of the FBI, CIA, etc. Now,
we have the entire internet community served by carrier Z convinced that
they're screwed.

"I couldn't resist Your Honor. It was just too easy. It was better than
broadcasting an alien invasion on the radio!"