About five years ago, Harrison College was facing the reality that it was time to upgrade its data center. The big questions – whether to build new facilities, upgrade the existing ones, or look to an entirely different approach – were all on the table. Joe Meadors, M.S.Ed., is the Senior Vice President – Information Services and Facilities at Harrison College, a network of thirteen locations across Indiana, Ohio, and North Carolina offering degrees in six disciplines and more than 30 fields of study.
At the time, Harrison College managed every component of its IT systems, keeping physical possession of its hardware and data, with all the expense and concerns that entailed, from worrying about natural disasters, to providing redundancy, to linking all the college’s campus systems, to the on-site challenges of climate control and physical security.
Not only was the existing system expensive; it also did not lend itself well to expansion. But abandoning a long-held model, shortcomings and all, for a modern but unfamiliar solution takes a lot of convincing.
Many headaches were eliminated when Harrison chose Indiana Fiber Network (IFN). Harrison no longer had to foot the expenses of HVAC, security, and redundancy. The College no longer faced a looming demand for more IT space. And Harrison didn't have to build its own fiber network or lease from multiple other providers. IFN keeps everything safe in its remote data center, and the necessary data are accessible across all campuses and functions.
IFN does not operate like some giant with a set program and one-way communications. “Services are set up to accommodate each location’s, and each discipline’s, needs,” Meadors says. IFN’s scalable approach merged handily with Harrison’s own. That scalability underlay the confidence that allowed Harrison College and IFN to build the entire 13-campus network all at once, rather than risking a potentially confusing or costly phased rollout.
Still, with multiple locations, things can go wrong. That’s where careful work matters: planning to meet the conditions of the RFP and tweaking details before the rollout. Meadors said the planning was crucial, and well done, so when it came time to flip the switch, “Most of it went as planned. Our challenges were minimal and anticipated. These were handled right away, and we were up and running.”
The need for additional capacity never goes away, and soon Harrison College was looking to add facilities. “Since the initial installation, we’ve added a couple locations,” Meadors said. “They went up easily. The model is absolutely scalable. Everything they designed initially has been sufficient. I think the RFP that we designed has met our needs well.”
Harrison was fortunate that its planning covered its needs. What the planners saw were future problems that the existing model and facilities wouldn’t be able to handle. Harrison wasn’t ‘behind,’ but it could see the day, in the near future, when it would be. And the old model was expensive, with no relief in sight. In other words, the need for a new model was obvious; the questions were which model to use, and when to implement it.
“Maybe you don’t want to build or expand a new data center,” Meadors continued. “Maybe you’d prefer to use a co-location solution. Having a co-location facility has pretty dramatic economic and operational advantages. We don’t have to worry about cooling, power, physical security — all that is expensive — it’s all built into the IFN model.” So now, “We don’t worry about cooling, security, power, uptime.”
The IFN solution is simple in both concept and execution. It allowed Harrison to maintain much of its existing structure – the parts that worked well. “Old data centers still exist, but only as spokes. The hub is IFN.” But the key to success is in the finished product: “Having our core data center in an IFN facility rather than in one of our buildings is a huge advantage,” Meadors explained. “Our gear is housed at the co-location — literally a wire cage with racks, in space they provide, at their hub. All our campuses come to the IFN cage. We built in redundancy, and it’s working really well. We have had terrific reliability with the IFN network, with very good response time.”