DOCSIS 3.1 upstream

Canadian cable carriers seem to have all told the CRTC that they can only
carry 42 MHz in the upstream because their amplifiers and nodes only
amplify that narrow band in the upstream direction.

Is/was 42 MHz common across North America?

In a typical installation, how much of the 42 MHz is actually usable for
upstream data? (i.e., how many 6 MHz chunks are taken for TV STBs,
telephony, and whatever other upstream channels?) I think I heard that
one 6 MHz chunk is used for one of the low-numbered analogue TV channels
(although cablecos are slowly withdrawing analogue TV now).

Am trying to figure out the realistic bandwidth that a cableco with a
42 MHz upstream limit will get on 3.1.

Also, have cablecos with such upstream limits begun to upgrade the
cable plant to increase the upstream bandwidth? Canadian cablecos have
told the regulator it would be prohibitively expensive to do so, but
incumbents tend to exaggerate these things when it is convenient (they
can then claim higher costs/congestion/need for node splits, which
increases regulated wholesale rates).

And would it be correct that in an RFoG deployment, the 42 MHz limit
disappears, as the customer equipment talks directly to the CMTS over
fibre all the way?

Jean-Francois Mezei wrote:

Canadian cable carriers seem to have all told the CRTC that they can only
carry 42 MHz in the upstream because their amplifiers and nodes only
amplify that narrow band in the upstream direction.

Is/was 42 MHz common across North America?

42 MHz was the traditional upper limit for Annex B DOCSIS. That limit
was extended up to 85 MHz several years ago, but yeah, there's probably
a lot of plant out there which can't go above 42 MHz for legacy reasons.

Am trying to figure out the realistic bandwidth that a cableco with a
42 MHz upstream limit will get on 3.1.

If the cableco is limited to 42 MHz, there will be 37 MHz of upstream
bandwidth (5 to 42 MHz), which allows five 6.4 MHz upstream channels at
5120 ksym/sec. 3.1 improves the upstream modulation from 64-QAM to
4096-QAM, which ups the throughput from 6 bits per symbol to 12 bits.
That gives 5120 * 5 * 12 = 307,200 kbit/s (~307 Mbit/s) of physical-layer
bit throughput, and you should budget roughly 25% for overhead to get
usable customer bits per second.
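
Back-of-envelope version of that math in Python (the ~25% overhead
figure is the same rough budget as above, not a spec number):

symbol_rate_ksym = 5120   # ksym/s per 6.4 MHz SC-QAM channel
channels = 5              # five 6.4 MHz channels fit in 5-42 MHz
bits_per_symbol = 12      # 4096-QAM
overhead = 0.25           # rough FEC/MAC overhead budget

phy_kbps = symbol_rate_ksym * channels * bits_per_symbol
usable_kbps = phy_kbps * (1 - overhead)
print(f"PHY:    {phy_kbps} kbit/s (~{phy_kbps / 1000:.0f} Mbit/s)")
print(f"Usable: {usable_kbps:.0f} kbit/s (~{usable_kbps / 1000:.0f} Mbit/s)")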

That's in lab conditions, though. The reality is that you're not going
to be able to use 4096-QAM unless your upstream path has ridiculously
good SNR. If the cable network can't go above 42 MHz, it's probably
legacy plant, which implies older deployments, and there's a real
likelihood that the improvements in DOCSIS 3.1 aren't going to make a
blind bit of difference. It would probably be easier and more reliable
to do plant upgrades / service retirement to allow 85 MHz (12 upstream
channels) than to clean up the plant so that you get the 30-35 dB SNR
required to run 4096-QAM. You can't make extra bandwidth out of nothing.
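
To put rough numbers on that trade-off (the SNR thresholds below are
rule-of-thumb assumptions for illustration, not spec values):

options = [
    # (label, channels, bits/symbol, rough SNR needed in dB)
    ("42 MHz plant, 64-QAM",    5,  6, 25),
    ("42 MHz plant, 4096-QAM",  5, 12, 35),
    ("85 MHz plant, 64-QAM",   12,  6, 25),
]
for label, ch, bps, snr in options:
    # 5120 ksym/s per channel, as above
    print(f"{label}: ~{ch * 5120 * bps / 1000:.0f} Mbit/s PHY, needs ~{snr} dB SNR")

Widening to 85 MHz at plain old 64-QAM (~369 Mbit/s) already beats
squeezing 4096-QAM into 42 MHz (~307 Mbit/s), without the heroic SNR.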

Also, have cablecos with such upstream limits begun to upgrade the
cable plant to increase the upstream bandwidth?

I would hope they have. If they don't, their businesses will be savaged
in the longer term by the introduction of GPON and other fiber technologies.

Nick

In our small, aging plant very near the Mexican border in south Texas, the SNR below ~30 MHz is ~20 dB, so we can only use two upstream channels. It works okay for our 150 cable modem customers. They can get 40 Mbps upstream throughput.

The downstream channels are around 300 MHz with much better SNR, so we can bond 8 channels. Depending on load, customers can get up to 80 Mbps downstream throughput.

This is on a DOCSIS 3.0 Cisco CMTS network with a 10-year-old cable plant.
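
Rough math on the upstream figure, assuming 16-QAM on those two channels
(which is about what ~20 dB SNR will support):

channels = 2
ksym_per_s = 5120   # per 6.4 MHz channel
bits = 4            # 16-QAM
print(channels * ksym_per_s * bits / 1000, "Mbit/s PHY")   # ~41 Mbit/s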

Lorell

Also, have cablecos with such upstream limits begun to upgrade the
cable plant to increase the upstream bandwidth? Canadian cablecos have
told the regulator it would be prohibitively expensive to do so, but
incumbents tend to exaggerate these things when it is convenient (they
can then claim higher costs/congestion/need for node splits, which
increases regulated wholesale rates).

Going to D3.1 in a meaningful way means migrating to either a mid-split at 85 MHz or a high split at 204 MHz (117 MHz is in the spec, but I've never heard anyone talk about it). It is not uncommon to see space (both for the upstream and downstream) freed up by sunsetting analog video channels.

Yes, one has to do a truck roll and swap out amplifiers etc., but that is relatively straightforward. The "guts" pop out of the enclosure that hangs from the messenger wire and are then replaced; you don't need to actually put a wrench on a coax connector in order to do this. There may need to be plant rebalancing (checking and possibly replacing tilt compensators), but that's something that should be happening on an annual basis or perhaps more often, depending on local practice.
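
For reference, rough numbers on each split option (my own back-of-envelope;
channel counts assume 6.4 MHz SC-QAM carriers and ignore guard bands):

splits = {
    "sub-split (legacy)": 42,
    "mid-split":          85,
    "117 MHz option":    117,
    "high-split":        204,
}
for name, top in splits.items():
    usable = top - 5   # the return band starts around 5 MHz
    print(f"{name:20s} 5-{top} MHz: ~{usable} MHz, "
          f"~{usable // 6.4:.0f} x 6.4 MHz channels")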

Fiber nodes are similar in terms of work to swap them out, though they may be more modular inside.

Amplifier insides: https://www.arris.com/globalassets/resources/data-sheets/starline-series-ble100-1-ghz-line-extender-data-sheet.pdf
Fiber node insides: CommScope fiber node data sheets

Passives (splitters, directional taps, terminators, and the like) are bidirectional and typically do not need to be replaced.

Possibly useful reading for folks who want an overview of how it all goes together: http://www.cablelabs.com/wp-content/uploads/2014/12/DOCSIS3-1_Pocket_Guide_2014.pdf

Without having read the Canadian cable providers' representations to the CRTC, I am ill-equipped to pass judgement on them, but in my personal opinion any discussion of "D3.1 deployment" that doesn't plan for a refactoring of splits is a bit dishonest.

And would it be correct that in an RFoG deployment, the 42 MHz limit
disappears, as the customer equipment talks directly to the CMTS over
fibre all the way?

RFoG is its own kettle of fish. Getting more than one channel on upstream for RFoG is hard. There's a scheduler for transmitting on individual RF channels, but not for the upstream laser, so you could have two lasers coming on at the same time when two cable modems (assume legacy D3.0 for a moment) transmit simultaneously on 19.40 MHz and 30.60 MHz. In an HFC plant, where the mixing between the two different radio frequencies happens in a copper wire which then feeds a single upstream fiber node, one doesn't have this problem.
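
A toy illustration (the grant numbers are made up): the per-channel
schedulers don't coordinate with each other, so transmit windows on
different channels can legitimately overlap in time. Harmless on coax,
two lasers on at once over RFoG.

# Hypothetical CMTS grants: (channel_MHz, start_us, end_us)
grants = [(19.40, 0, 800), (30.60, 500, 1300)]

def overlap_in_time(a, b):
    # Windows overlap in time regardless of which RF channel they're on
    return a[1] < b[2] and b[1] < a[2]

print(overlap_in_time(*grants))  # True: two upstream lasers keyed on at once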

-r

Going to D3.1 in a meaningful way means migrating to either a mid-split at 85 MHz or a high split at 204 MHz

Thanks. This is what I expected. But in the past, the Canadian cablecos
had argued that removing the 42 MHz upstream limitation was a huge
endeavour (they have to convince the CRTC to keep wholesale rates up, so
they create artificial scarcity by claiming that replacing all those
42 MHz repeaters would cost a fortune and that they have to do node
splits instead).

Arguing at the CRTC is all about finding out which incumbent statements
are just spin and which are true.

Thanks for the links as well.

RFoG is its own kettle of fish. Getting more than one channel on upstream for RFoG is hard.

But they can allocate a single very big channel, right? Or did you
mean a single traditional 6 MHz NTSC channel?

Going to D3.1 in a meaningful way means migrating to either a mid-split at 85 MHz or a high split at 204 MHz

Thanks. This is what I expected. But in the past, the Canadian cablecos
had argued that removing the 42 MHz upstream limitation was a huge
endeavour (they have to convince the CRTC to keep wholesale rates up, so
they create artificial scarcity by claiming that replacing all those
42 MHz repeaters would cost a fortune and that they have to do node
splits instead).

In my opinion, that fails the sniff test. I don't have any particular budgetary information, but I have a really hard time believing that pervasive node splits are cheaper than fixing the plant's US/DS splits.

By the way, just as one typically finds downstream DOCSIS channels in the 600-ish MHz range, because that's the space that became freshly available when the plant got upgraded from 400 MHz to 800 MHz, one is likely to find the 'fat' D3.1 OFDM upstream channels in the freshly-freed-up space that comes from doing the split realignment. Remember that you need to keep the old upstreams in order to support all the old crufty D2.0 and D3.0 (and, sadly, probably the odd D1.1) modems out there.
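
A sketch of what that constraint produces on a mid-split plant (the
frequencies are illustrative, not from any real deployment): legacy
SC-QAM carriers stay at the bottom, OFDMA lands in the freed space.

# Illustrative 5-85 MHz upstream plan
legacy_scqam = [(10.4, 16.8), (16.8, 23.2), (23.2, 29.6), (29.6, 36.0)]
ofdma_block = (42.0, 85.0)   # D3.1 OFDMA channel in the newly freed space

for lo, hi in legacy_scqam:
    print(f"SC-QAM  {lo:5.1f}-{hi:5.1f} MHz (D2.0/D3.0 modems)")
print(f"OFDMA   {ofdma_block[0]:5.1f}-{ofdma_block[1]:5.1f} MHz (D3.1 modems)")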

Arguing at the CRTC is all about finding out which incumbent statements
are just spin and which are true.

Thanks for the links as well.

RFoG is its own kettle of fish. Getting more than one channel on upstream for RFoG is hard.

But they can allocate a single very big channel, right? Or did you
mean a single traditional 6 MHz NTSC channel?

They can allocate a single very big channel, but unlike QAM modulation, with OFDM you can have multiple stations transmitting at the same time on the same channel. So if anything, the optical beat interference from having more than one laser on at once is likely to be worse (for some values of worse - I don't know of anyone labbing such a thing up and trying to characterize just how bad it gets how fast with multiple transmitters - it might become intolerable with 2 on and it might not). I ran this past a colleague and he said "ewwwww why would anyone do D3.1 over RFoG?". I think that pretty much sums it up.
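
No lab data here, but the scaling is easy to sketch: if each of N ONUs
has its laser on some fraction p of the time, the chance of two or more
lasers being on at once (the OBI condition) grows quickly with N. The
duty cycle below is a made-up illustrative number.

def p_obi(n, p):
    # P(two or more lasers on) = 1 - P(none on) - P(exactly one on),
    # assuming n independent ONUs each transmitting a fraction p of the time
    return 1 - (1 - p) ** n - n * p * (1 - p) ** (n - 1)

for n in (2, 8, 32):
    print(f"{n:2d} ONUs, 10% duty cycle: P(OBI condition) ~ {p_obi(n, 0.10):.3f}")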

My personal opinion is that two-way RFoG is a super bad idea, but one-way RFoG on a WDM-separated channel to support legacy QAM (with PON for your high speed data) is OK, with the caveat that if you want two-way set-top boxes, you're gonna have to figure out how to have your STBs speak Ethernet or MoCA or something to get out via your commodity high speed data connection. The latter is the way that FiOS does it.

-r