A Real-Time Data Center Cooling System (RTDCCS) is created by combining the Individual Server Enclosure Cooling System (ISECS) with the Multistage Evaporative Cooling System (MECS). By combining these two patented, energy-efficient cooling systems, the RTDCCS can save data centers 60 to 85 percent of their cooling energy costs.

Patents – Real-Time Data Center Cooling System (RTDCCS)

MECS is an air and water cooling system used to cool IT equipment, industrial processes, commercial and industrial buildings, and hospitals. The unique cooling technology and components of the MECS use the earth’s natural water cycle and the laws of thermodynamics to produce cooling, do not incorporate any energy-intensive traditional mechanical refrigeration compressors, and do not use Freon-type refrigerants, including hydrochlorofluorocarbons (HCFCs). MECS is scalable from 10 tons to over 3,000 tons of equivalent mechanical refrigeration.
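
To illustrate the underlying principle (not the patented MECS design), the short Python sketch below estimates the cold-water temperature an evaporative system can deliver as the site’s design wet-bulb temperature plus an approach margin. The 7 °F approach and the example wet-bulb values are assumptions for illustration only; a multistage design can achieve a smaller approach, reaching closer to the ambient wet bulb.

    # Illustrative sketch only (assumed values, not MECS design data):
    # evaporative cooling is bounded by the ambient wet-bulb temperature,
    # so supply-water temperature ~= design wet bulb + an approach margin.

    def evaporative_supply_temp_f(wet_bulb_f, approach_f=7.0):
        """Estimated cold-water supply temperature in degrees F."""
        return wet_bulb_f + approach_f

    if __name__ == "__main__":
        # Hypothetical summer design wet-bulb temperatures (placeholders).
        design_wet_bulb_f = {"Site A": 70.0, "Site B": 65.0}
        for site, wb in design_wet_bulb_f.items():
            supply = evaporative_supply_temp_f(wb)
            print(f"{site}: ~{supply:.1f} F supply water "
                  f"(wet bulb {wb:.1f} F + assumed 7 F approach)")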

Data Center Cooling

Individual Server Enclosure Cooling System (ISECS) – R4 Ventures Fan Coil Unit
The Individual Server Enclosure Cooling System (ISECS) is a cold water and cold air delivery system that delivers process cooling to data center enclosures (Racks) with loads of 35 kW and higher on a Real Time basis.

  • Process cooling adjusts to the actual load of the Rack by increasing or decreasing water flow to the cooling coils and increasing or decreasing the fan speed in the ISECS fan coil unit to match the load in real time. (Accomplished via the real-time monitoring and control system; see the control-loop sketch after this list)
  • Adjusts water flow and fan speed in the ISECS fan coil unit to meet cooling loads in Real Time (loads can vary from a low of 1 kW to a high of 50 kW)
  • The anticipated hot air temperature exiting the server racks is 125°F ±5°F.
  • Uses 75°F ±5°F water in the precooling stage (precooling coil in the ISECS fan coil unit) and 65°F ±5°F water in the final cooling stage (final cooling coil in the ISECS fan coil unit), with both cool water flows supplied by the MECS
  • Provides 70°F to 80.6°F cool air (80.6°F is the ASHRAE recommended maximum white space air temperature) back to the Data Center space based on the set point temperature.
  • Eliminates CRACs and CRAHs in the white space freeing up space for additional server racks. (More data center capacity and more revenue opportunities)
  • Eliminates hot aisle and/or cold aisle containment ducts
  • ISECS fan coil units can be incorporated into raised floor designs or placed above the Individual Racks over the aisles.
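
As a rough illustration of the real-time control described above, the Python sketch below scales chilled-water flow with the measured Rack load and trims fan speed against the supply-air set point. The gains, flow limits, and sensor inputs are hypothetical placeholders, not the patented ISECS control logic.

    # Illustrative sketch only (hypothetical gains and limits, not the patented
    # ISECS control logic): water flow follows the Rack load, and fan speed trims
    # the supply-air temperature toward the set point.

    from dataclasses import dataclass

    @dataclass
    class IsecsCommand:
        water_flow_gpm: float  # chilled-water flow to the precooling and final coils
        fan_speed_pct: float   # fan speed in the ISECS fan coil unit

    def control_step(rack_load_kw, supply_air_f, set_point_f=76.8,
                     max_load_kw=50.0, max_flow_gpm=30.0, gain_pct_per_degf=10.0):
        """One real-time control iteration for a single Rack."""
        # Water flow scales with the Rack's instantaneous load (1-50 kW range).
        flow = max_flow_gpm * min(max(rack_load_kw / max_load_kw, 0.0), 1.0)
        # Fan speed rises when supply air runs above set point, falls when below.
        error_f = supply_air_f - set_point_f
        fan = min(max(50.0 + gain_pct_per_degf * error_f, 20.0), 100.0)
        return IsecsCommand(water_flow_gpm=flow, fan_speed_pct=fan)

    if __name__ == "__main__":
        # Example: a 35 kW Rack whose supply air is running 1.2 F above set point.
        print(control_step(rack_load_kw=35.0, supply_air_f=78.0))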

Comparison of Data Center White Space Temperature Profiles

Summary of Temperature Performance in Phoenix AZ, San Jose CA and Washington DC.

White Paper – Preliminary Temp Performance Evaluation MECS and RTDCCS 6-25-18

Data Centers – Real-Time Data Center Cooling System (MECS + ISECS)

  1. Phoenix AZ
    • ASHRAE published Summer Design Conditions of 0.4% for evaporative applications – Data Center White Space temperature (for the entire compute space) can be maintained at a set point temperature of 76.8 °F (24.89 °C), completely eliminating hot aisles and cold aisles. No compressors or refrigerants are used in the system. Significant additional energy can be saved by maintaining an ASHRAE TC 9.9 (Class A1 components on Page 11) recommended set point temperature of 80.6 °F (27 °C) in the Data Center White Space, which is well under the ASHRAE TC 9.9 allowable data center white space temperature of 89.6 °F (32 °C).
    • Based on the Monthly Mean Dry Bulb and Wet Bulb Temperatures – Data Center White Space temperature (for the entire compute space) can be maintained at a set point temperature of 75.5 °F (24.17 °C) in the hottest month of August, completely eliminating hot aisles and cold aisles. No compressors or refrigerants are used in the system. Significant additional energy can be saved by maintaining an ASHRAE TC 9.9 (Class A1 components on Page 11) recommended set point temperature of 80.6 °F (27 °C) in the Data Center White Space, which is well under the ASHRAE TC 9.9 allowable data center white space temperature of 89.6 °F (32 °C).
  2. San Jose CA
    • ASHRAE published Summer Design Conditions of 0.4% for evaporative applications – Data Center White Space temperature (for the entire compute space) can be maintained at a set point temperature of 74.39 °F (23.55 °C), completely eliminating hot aisles and cold aisles. No compressors or refrigerants are used in the system. Significant additional energy can be saved by maintaining an ASHRAE TC 9.9 (Class A1 components on Page 11) recommended set point temperature of 80.6 °F (27 °C) in the Data Center White Space, which is well under the ASHRAE TC 9.9 allowable data center white space temperature of 89.6 °F (32 °C).
    • Based on the Monthly Mean Dry Bulb and Wet Bulb Temperatures – Data Center White Space temperature (for the entire compute space) can be maintained at a set point temperature of 73.56 °F (23.09 °C) in the hottest month of July, completely eliminating hot aisles and cold aisles. No compressors or refrigerants are used in the system. Significant additional energy can be saved by maintaining an ASHRAE TC 9.9 (Class A1 components on Page 11) recommended set point temperature of 80.6 °F (27 °C) in the Data Center White Space, which is well under the ASHRAE TC 9.9 allowable data center white space temperature of 89.6 °F (32 °C).
  3. Washington DC
    • ASHRAE published Summer Design Conditions of 0.4% for evaporative applications – Data Center White Space temperature (for the entire compute space) can be maintained at a set point temperature of 81.87 °F (27.71 °C), completely eliminating hot aisles and cold aisles. No compressors or refrigerants are used in the system. The system would operate about 7.73 °F below the ASHRAE TC 9.9 (Class A1 components on Page 11) allowable white space temperature of 89.6 °F (32 °C).
    • Based on the Monthly Mean Dry Bulb and Wet Bulb Temperatures – Data Center White Space temperature (for the entire compute space) can be maintained at a set point temperature of 76.20 °F (24.56 °C) in the hottest month of July, completely eliminating hot aisles and cold aisles. No compressors or refrigerants are used in the system. Significant additional energy can be saved by maintaining an ASHRAE TC 9.9 (Class A1 components on Page 11) recommended set point temperature of 80.6 °F (27 °C) in the Data Center White Space. (See the headroom comparison after this list)
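
The short sketch below simply tabulates the design-condition set points quoted above against the ASHRAE TC 9.9 Class A1 recommended (80.6 °F) and allowable (89.6 °F) limits. The set points are taken from the summary; only the headroom arithmetic is added.

    # Headroom check using the set points quoted in the summary above.
    ASHRAE_RECOMMENDED_F = 80.6
    ASHRAE_ALLOWABLE_F = 89.6

    # City -> white space set point (F) at the 0.4% summer design condition.
    design_set_points_f = {
        "Phoenix AZ": 76.8,
        "San Jose CA": 74.39,
        "Washington DC": 81.87,
    }

    for city, sp in design_set_points_f.items():
        print(f"{city}: {sp:.2f} F set point, "
              f"{ASHRAE_RECOMMENDED_F - sp:+.2f} F vs recommended, "
              f"{ASHRAE_ALLOWABLE_F - sp:+.2f} F vs allowable")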

[Data Center White Space temperature profile charts: Phoenix AZ, San Jose CA, and Washington DC]