This post is a little late - almost 6 years late.

I was scrolling through my photos and saw that I had documented more of this build than I remembered.

In 2020, our building was closed (like everyone else's). Worship still had to happen, though. We had a parking structure we could use as an outdoor venue and a production team, but no permanent internet connection in the structure itself.

What we did have was a deadline, a congregation expecting a live service, and whatever infrastructure already existed between buildings.

This is how we extended our network across ~1,088 feet of legacy copper and delivered a stable connection for worship and livestream from the top of a parking garage.

The Constraint

The parking structure had:

  • No ISP handoff
  • No fiber
  • No Ethernet runs
  • No active network ports

However, it did have an elevator.

Elevators require phone connectivity, which meant there was existing twisted pair running between buildings and the structure. That was the only physical pathway available.

That became the plan.

POTS punchdown block
Punchdown block for elevator phone lines

Validating the Physical Layer

Before deploying anything, I needed to confirm:

  1. The copper pair connected to a building with an internet connection.
  2. The distance was within range of the available extenders.
  3. The pair was usable and not degraded.

I found another phone punchdown block in a nearby building; fortunately, it was in the IDF. Identifying which pair ran to the parking structure took some time, but once I located it, I could begin testing.

POTS punchdown block
It took a little time to identify which lines ran into the parking structure

Using a Fluke MicroScanner2, I measured the run.

It came in at approximately 1,088 feet (~331 meters) — well beyond standard Ethernet limits (100m), but within the supported range of the DSL extender pair we had available.
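
The arithmetic is simple enough to sanity-check in a couple of lines. A minimal sketch in Python; the extender reach figure here is illustrative, not from our unit's datasheet:

```python
# Sanity check: is the measured run usable over the extender pair?
FEET_TO_METERS = 0.3048

run_ft = 1088
run_m = run_ft * FEET_TO_METERS            # ~331.6 m

ETHERNET_LIMIT_M = 100                     # twisted-pair Ethernet segment limit
EXTENDER_REACH_M = 1000                    # illustrative; check the extender's datasheet

print(f"Run length: {run_m:.1f} m")                            # 331.6 m
print(f"Within Ethernet spec: {run_m <= ETHERNET_LIMIT_M}")    # False
print(f"Within extender reach: {run_m <= EXTENDER_REACH_M}")   # True
```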

Fluke Networks MicroScanner2 with 2 pairs connected
Testing the POTS connection using a Fluke MicroScanner2

Because this was a phone line (POTS wiring), only two conductors mattered. After identifying the correct pair at both punchdown blocks, we prepared to extend the network.

Extending Layer 2 Over Copper

The solution was a DSL-based Ethernet extender pair (VDSL2 transport).

Building Side:

  • Layer 3 switch with a fiber uplink to the core switch
  • Patch into identified phone pair
  • DSL transmitter

Parking Structure Side:

  • Terminate matching pair
  • DSL receiver
  • Connect to field switch
  • Cat6 up to the 3rd floor (roof)
  • Connect to PoE switch

This effectively extended our Layer 2 network across the buildings using infrastructure never intended for Ethernet.
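
Putting the two sides together, the signal path looked roughly like this:

```
core switch --(fiber)--> L3 switch --> DSL transmitter
                                            |
                              ~1,088 ft elevator phone pair
                                            |
DSL receiver --> field switch --(Cat6)--> rooftop PoE switch --> production gear + AP
```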

Electrical room in the parking structure with a DSL receiver connected to a phone block and switch
Right: DSL receiver connected to the phone punchdown block. Left: DSL receiver connected to a 5-port switch

Incremental Validation

Before introducing production equipment, I needed to validate connectivity step by step:

  1. DSL sync established.
  2. Switch connected.
  3. Link tested with NetAlly LinkSprinter.
  4. Confirmed DHCP, gateway reachability, and internet access.

Netscout LinkSprinter showing a link connection, DHCP, DNS, and internet
Green lights on the LinkSprinter
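
The LinkSprinter bundles those checks behind one button, but the same ladder is easy to reproduce from any laptop on the link. A minimal sketch, assuming the laptop has already pulled a DHCP lease; the gateway address and test hostname are placeholders:

```python
import socket
import subprocess
import urllib.request

GATEWAY = "192.168.1.1"  # placeholder: substitute the real gateway address

def gateway_reachable(ip: str) -> bool:
    # Single ping, Linux-style flags (-W timeout in seconds).
    result = subprocess.run(["ping", "-c", "1", "-W", "2", ip],
                            capture_output=True)
    return result.returncode == 0

def dns_resolves(host: str = "example.com") -> bool:
    try:
        socket.gethostbyname(host)
        return True
    except socket.gaierror:
        return False

def internet_reachable(url: str = "http://example.com") -> bool:
    try:
        urllib.request.urlopen(url, timeout=5)
        return True
    except OSError:
        return False

checks = [
    ("gateway", gateway_reachable(GATEWAY)),
    ("dns", dns_resolves()),
    ("internet", internet_reachable()),
]
for name, ok in checks:
    print(f"{name:10s} {'PASS' if ok else 'FAIL'}")
```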

Only after confirming full path integrity did we proceed.

We then performed throughput testing.

Speedtest that shows the results

Results:

  • ~22 ms latency
  • ~49 Mbps down
  • ~32 Mbps up

That was more than enough for a stable 1080p livestream, with margin left for overhead.
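
For streaming, the upload figure is the one that matters. Back-of-napkin math, assuming a ~6 Mbps 1080p encode (our actual encoder settings may have differed):

```python
# Rough uplink headroom check for the livestream.
upload_mbps = 32     # measured
stream_mbps = 6      # assumed 1080p H.264 encode; actual settings may have differed
safety = 1.5         # margin for bursts, retransmits, and other traffic

required = stream_mbps * safety                    # 9.0 Mbps
print(f"Required with margin: {required} Mbps")
print(f"Headroom: {upload_mbps - required} Mbps")  # 23.0 Mbps
```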

Production & Network Stack

Once we knew we could get a reliable connection for the livestream, we shifted focus to the portable solutions we had available and started testing.

2 laptops on a bench connected to a Blackmagic ATEM TV Studio for testing
Testing a Blackmagic ATEM Television Studio

The network extension provided uplink to the building, but the rooftop still required local distribution.

Inside the portable rack:

  • Blackmagic ATEM Television Studio (video switching)
  • Blackmagic UltraStudio 4K (capture/interface)
  • Blackmagic HyperDeck (recording)
  • (2) Mac Minis (stream encoding and ProPresenter)
  • (2) 8-port PoE switches
  • Power distribution
  • Local switching for production devices

Portable livestream rack
Portable livestream rack

The rooftop PoE switch served two purposes:

  1. Distributed wired connectivity to production gear.
  2. Powered a mounted access point for musician devices.

Temporary Wireless for Musicians

We mounted an access point to a rooftop lighting truss to provide controlled wireless access for musician iPads.

Person on a ladder installing lighting on a truss
Truss that was sandbagged and anchored for extra stability in case of windy conditions

The goal wasn’t general internet access — it was:

  • Reliable connectivity for charts
  • Communication tools
  • Production-related apps

Because our usable bandwidth was roughly 40 Mbps, we intentionally did not provide guest WiFi.

This was a deliberate decision:

  • Reduce RF congestion.
  • Eliminate non-essential traffic.
  • Protect available uplink bandwidth.
  • Minimize support variables on event day.

The wireless network was purpose-built for production and musicians only.

Field Deployment

This wasn’t a studio environment.

It was:

  • Rooftop exposure
  • Temporary truss
  • Outdoor lighting
  • Weather considerations
  • Folding tables and temporary control stations

Production crew testing audio and video during worship band rehearsals
Production crew getting ready for a service

Power distribution, cable management, and signal integrity all had to function in a non-permanent environment.

Why Reliability Mattered

This wasn’t a technical experiment.

These were live worship services (including Christmas and Easter).

People attending Easter services on the roof of a parking structure
Easter services on top of the parking structure

Hundreds of people gathered in a parking structure because indoor gatherings weren’t possible. The livestream extended that reach further.

The network path had to hold. At one point during this build, I sent a Calvin and Hobbes image to some fellow Church IT friends. It was a reminder that none of us were building under ideal conditions, but we were building anyway.

Calvin and Hobbes strip with Calvin saying "My brain is trying to kill me."
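
A tiny watchdog is cheap insurance for a link like this. The sketch below is an illustration rather than what we actually ran, and the gateway address is a placeholder:

```python
import subprocess
import time
from datetime import datetime

GATEWAY = "192.168.1.1"  # placeholder: the building-side gateway

# Ping the gateway every few seconds and timestamp any misses,
# so a flapping DSL link shows up immediately in the log.
while True:
    ok = subprocess.run(
        ["ping", "-c", "1", "-W", "2", GATEWAY],
        capture_output=True,
    ).returncode == 0
    if not ok:
        print(f"{datetime.now().isoformat()} LINK DROP")
    time.sleep(5)
```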

What I’d Do Differently Today

Looking back, a few improvements stand out:

  • Install a Ubiquiti airFiber (point-to-point wireless) link between the buildings
  • Label temporary power distribution more clearly
  • Pre-stage more cabling to reduce setup time
  • Use a portable rack on wheels, similar to the audio racks, for better ventilation

The architecture worked. But there’s always room to refine risk mitigation.

Final Thoughts

This deployment reinforced a simple principle:

When faced with constraints, look for existing physical pathways before assuming new infrastructure is required.

In this case, elevator copper became a temporary backbone.

Layer by layer, we validated:

  • Physical continuity
  • Link establishment
  • Network reachability
  • Throughput
  • Production workflow

The result was a stable outdoor livestream built on infrastructure that was never intended for data transport.

Sometimes engineering isn’t about ideal conditions. It’s about understanding the systems in front of you and extending them responsibly.