Infrastructure First: Inside the Dream Center’s $10M Transformation
Walking into the Dream Center in Dallas, you immediately notice two things: the stunning 1940s self-supporting dome overhead, and the fact that this 85-year-old building somehow looks like it was designed yesterday for modern church production.
That’s not an accident.
Jake and I spent the day with Tyler Mergy from Skylark, Kris Smith (Technical Director), Ryan Bates (Media Director), and Kacie Kintz (Dream Center Director), exploring how they transformed a historic landmark into a fully functional campus for Church 1132 while maintaining its historical status. The project involved over $10 million in renovations, and the AV integration alone is a masterclass in doing things right the first time.
The Challenge: Historical Constraints Meet Modern Expectations
Here’s the problem Tyler and his team faced: everything in this building is plaster and concrete. No drywall. No accessible ceiling spaces. And the historical society had veto power over virtually every decision.
“The historical society did not allow us to put an out fill on the wall,” Tyler explained. There are literally four seats in the room that don’t get proper coverage, because they couldn’t mount a single X8 speaker.
But here’s what makes this interesting: instead of fighting the constraints, Skylark built a system that works with the building’s limitations while preparing for a future that doesn’t exist yet.
Infrastructure Over Gear (Every Single Time)
Tyler dropped a line that should be written on the wall of every church equipment room: “We’d rather sell Blackmagic with patch bay than a Ross Ultrix without patch bay.”
Let that sink in. They’d rather spec cheaper gear with proper infrastructure than expensive gear with quick-and-dirty cabling.
Why? Because infrastructure outlasts gear. Tyler estimates 20 years for quality infrastructure versus 10 years for processing equipment (though he thinks churches should push longer on console upgrades).
Here’s what that philosophy looks like in practice:
Every audio connection lands on patch bay. Not just some connections. Every. Single. One. This means when they wanted to swap their DM48 for an Avantis console with a GX4816, they literally just unplugged cables and plugged them back in. Nothing changed on stage. No re-terminating. No new cable runs.
All network infrastructure is fiber-backed 10Gig. Six network switches across the facility, all properly integrated. Not “this is my Dante switch” and “this is my KVM switch.” One unified network that can handle anything they throw at it.
DMX over Cat6 using Luminex LumiNode 12 converters. This is brilliant. Instead of running 5-pin DMX everywhere, they’re running Cat6 to tie-line panels throughout the building. Need DMX somewhere? Patch it. Need more than one universe? Drop a node. The flexibility is nearly unlimited.
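To make the "patch it, don't pull it" idea concrete, here is a minimal sketch of a tie-line patch map. The panel and port names are invented for illustration, and this models the paperwork, not the Luminex hardware itself: every port is a Cat6 home run back to the rack, and getting DMX somewhere new is just an assignment.

```python
# Hypothetical tie-line patch map: each wall-panel port is a Cat6 run back
# to the rack; DMX universes get there by patching, not by pulling cable.
class TieLinePatch:
    def __init__(self, ports):
        # ports: labels like "truss-1/1" (panel/port) -- names are made up
        self.assignments = {port: None for port in ports}

    def patch(self, port, universe):
        if port not in self.assignments:
            raise KeyError(f"no tie-line run at {port}")
        self.assignments[port] = universe  # e.g. a DMX universe number

    def free_ports(self):
        # spare runs = future flexibility, the whole point of over-pulling
        return [p for p, u in self.assignments.items() if u is None]

# Need DMX at the downstage truss? Patch universe 3 to an unused port.
patch = TieLinePatch(["truss-1/1", "truss-1/2", "stage-left/1"])
patch.patch("truss-1/1", 3)
print(patch.free_ports())
```

The design choice worth copying is the over-provisioning: the map tracks spare runs explicitly, so "can we add a fixture there?" is answered by looking at a list, not by opening a ceiling.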
The Motor Control Decision
They installed quarter-ton Columbus McKinnon entertainment hoists with installed motor control on three lighting positions. The upfront cost made them pause, but Tyler’s math is simple: the motor control system probably paid for itself with the first two lift rentals they didn’t need.
Think about that. No more taking out pews. No more expensive atrium lift rentals. No more scheduling nightmares. Just plug in the control panel downstairs and drop the truss.
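Tyler's payback claim is easy to sanity-check with back-of-envelope math. Every dollar figure below is an assumption for illustration, not a project number:

```python
# Break-even math for installed motor control vs. renting lifts.
# Both figures are assumed placeholders, not Skylark's actual costs.
motor_control_cost = 6000   # assumed incremental cost of installed control
lift_rental_cost = 3000     # assumed cost of one atrium lift rental

# Ceiling division: how many avoided rentals pay off the install?
rentals_to_break_even = -(-motor_control_cost // lift_rental_cost)
print(rentals_to_break_even)  # -> 2 under these assumptions
```

Under these placeholder numbers the system pays for itself in two avoided rentals, which matches the spirit of Tyler's estimate; plug in your own quotes to run the same check.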
Each motor point has full tie-line panels with multiple Cat6 runs for DMX, network, and analog audio (because why not put crowd mic capability everywhere?). They even have rig points and tie-line panels for future IMAG screens that don’t exist yet.
Building for Tomorrow’s Gear Today
This is where Skylark’s approach gets really interesting. Look at their rack layout:
- Rack 1: Graphics machines and computers, with open slots ready for Resi decoders and additional playback machines
- Rack 2: Video processing with everything landing on patch bay (so swapping that Blackmagic router for a Ross Ultrix is literally just re-patching cables)
- Rack 3: The destination rack with tons of space for HyperDecks, media servers, and fiber infrastructure for CCU-based cinema cameras
The NovaStar LED processor? Wasn’t there on day one. But when they added LED walls for their conference, they just patched the processor into existing tie-lines. No emergency cable runs across the floor.
The Canon PTZ camera provisions? Built in. The IMAG screen positions? Rigged and ready with power and data.
The Audio Strategy: Allen & Heath Ecosystem
Every 1132 campus runs Allen & Heath, and at the Dream Center they’re running an Avantis for front of house with a GX4816 stage rack and DX168 expander. The monitor mixes run off the same console.
Kris is all-in on the ME-1 personal mixers: “It’s probably the most simple and easiest personal mixer that anybody can approach at a high channel count.” All backline musicians run wired ME-1s, while vocalists get wireless – Shure QLXD for vocals, Axient for pastors, and PSM 900 for in-ears.
One detail I loved: they sunk X12 wedges directly into the stage floor behind metal grates. Pull up the grate for access, otherwise you have a completely clean stage. Same with the KS28 subs under the stage – there’s a hatch for service access but otherwise they’re completely hidden.
PA System: L-Acoustics in a Difficult Room
The main system is L-Acoustics A15i arrays (constant curvature with adjustable angles) and KS21 subs flown, plus three KS28 dual-18s under the stage and five 5XT front fills hidden in the stage mesh.
The acoustic challenges are significant. They couldn’t treat the walls. They couldn’t add acoustic panels. The historical society said no to everything.
So Tyler’s approach was pure targeting: “Get audio to where audio needs to be and minimize reflections.” They used L-Acoustics modeling to see exactly where the arrays were hitting and turned on the walls in the model to minimize HF splash. The low-frequency buildup got addressed in tuning, with the understanding that as they’re eventually allowed to treat the space, they can bring some of that low end back.
Lighting: Working with a Dome
You can’t fly a straight truss in a dome. So instead of fighting it, they used the original I-beams that run between the architectural pillars. Three lighting positions hit the stage better than a center truss would have anyway, and they preserve the architectural beauty of the dome.
The fixtures are a mix of Chauvet Rogue R2X variable whites for specials and Rogue R3 wash lights. Kris raves about the R3s: “Honestly probably one of the best face lights you could possibly get.” They also have Chauvet Strike Array 2 blinders for big moments.
Everything runs on DMX through those Cat6 tie-line panels, controlled by a HedgeHog console. And here’s a detail that matters: the architectural dome lighting isn’t just decorative. It’s fully DMX controlled in zones and integrated into the lighting console. The architect specified placement, Skylark gained control, and now they can use the building itself as part of the worship experience.
The Broadcast Room That Isn’t (But Could Be)
Right now, the Dream Center doesn’t live stream services. But the entire facility was built as if they do.
ATEM Constellation 2ME switcher. Full KVM system with Adder XDIP transceivers. Green Go IP-based comms with Dante integration. Multiple operator positions that can control any computer from any location.
Why build all this if you’re not streaming? Because when they want to stream, or when they host their annual conference, or when they need to do campus-to-campus communication with their other locations, everything is already there.
The KVM system is particularly clever. Control-Alt-C switches between computers, so the operator position itself doesn’t matter. CG could be at front of house or in the broadcast room. Engineering could access router control from anywhere. It’s all about workflow flexibility.
Power Done Right
Two separate power systems: ND (non-dim) for lighting and LED power, and IG (isolated ground) for audio and video. The isolated ground runs through its own transformer, eliminating ground lift issues before they start.
No relay panels though. In Tyler’s view, they’re becoming obsolete. Everything has battery backups and needs proper shutdown sequences anyway. Amplifiers go into standby via Q-Sys, and everything else shuts down like a computer should.
The LinTech 16-channel motor controller? Installed by the electrician, and Tyler says it’s “shocking how much it doesn’t cost” compared to what people fear.
What This Means for Your Church
You don’t need a $10 million budget to apply these principles:
- Invest in infrastructure first. Patch bays, proper network backbone, conduit in walls. This stuff lasts 20 years. Your console will be replaced twice in that time.
- Build for the future you can’t afford yet. Run extra Cat6. Put in tile-line panels. Provision camera positions. When budget appears, you’re ready.
- Use patch bays to future-proof console decisions. Allen & Heath today, Yamaha tomorrow – your stage doesn’t care if everything lands on patch bay first.
- Consider installed motor control early. If your electrician is already opening walls and running conduit, the incremental cost is surprisingly low compared to renting lifts forever.
- Think network-first for everything. DMX, video, control, audio. If it can run on Cat6, run it on Cat6 and give yourself options.
The Dream Center proves you can honor history while building for the future. You can work within constraints while maintaining flexibility. And you can make infrastructure investments that serve ministry for decades instead of chasing the next gear upgrade.
Want to see all this in action? Watch the full tech tour on our YouTube channel. And if you’re planning a renovation or new build, the Churchfront team can connect you with integrators who think infrastructure-first. Reach out at churchfront.com.
Church 1132 Dream Center (Oak Cliff Campus)
Integration: Skylark
Technical Director: Kris Smith
Media Director: Ryan Bates
Dream Center Director: Kacie Kintz