Author: Manoj Kelkar

Disaster Recovery Strategies For Big Data

As big data becomes mainstream, organizations are exploring this avenue to transform their business and provide their clients with insightful business intelligence. They have learned that big data is not a single technology, technique, or initiative. Instead, it is a trend that spans many areas of business and technology. Using this technology, they have started defining new initiatives and are in the process of reevaluating existing strategies.

 

The sudden growth in data has made companies change the way they access mission-critical information, deploy applications, and approach data protection in general. The concern now is whether these companies have a disaster recovery plan in place to recover from a catastrophic outage. A disaster recovery plan (DRP) can help get systems back online quickly and efficiently. This plan documents the procedures required to recover critical data and IT infrastructure after an outage. Here are some key points to consider when deciding on disaster recovery strategies:

  • Set up & finalize the RPO (Recovery Point Objective)

The recovery point objective (RPO) is the interval that equates to the maximum acceptable amount of data loss after an unplanned outage occurs. The RPO is crucial because it determines the frequency at which you need to back up your data. Ask yourself how many hours of data you can manage or afford to lose when systems go down. If your RPO is two hours of data, then you need to perform backups at least every two hours. The management executives and the entire team need to agree upon the RPO.
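The backup-frequency arithmetic above can be sketched as a quick check. This is a minimal, hypothetical Python snippet (the function name and schedule are illustrative, not part of any particular backup tool): it verifies that no gap between successive backups, or between the latest backup and now, exceeds the RPO.

```python
from datetime import datetime, timedelta

def backups_meet_rpo(backup_times, rpo_hours, now=None):
    """Return True if the gap between consecutive backups, and the gap
    from the latest backup to 'now', never exceeds the RPO."""
    now = now or datetime.now()
    times = sorted(backup_times) + [now]
    rpo = timedelta(hours=rpo_hours)
    gaps = [b - a for a, b in zip(times, times[1:])]
    return all(gap <= rpo for gap in gaps)

# With a 2-hour RPO, backups every 2 hours satisfy the objective.
start = datetime(2024, 1, 1, 0, 0)
schedule = [start + timedelta(hours=2 * i) for i in range(4)]  # 00:00..06:00
print(backups_meet_rpo(schedule, rpo_hours=2, now=start + timedelta(hours=7)))  # True
```

If the latest backup were three hours stale, the same check would fail, signaling that the schedule no longer meets the agreed RPO.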

  • Offsite backups

The most obvious disaster recovery step is to keep data stored in a remote location. Off-site backups ensure that data will remain unharmed during unforeseen situations like a natural calamity or fire. For big data, cloud-based backup is probably the best option because it is cheap and easy to back up your data to the cloud, particularly batch data, which is large and static.

  • Conduct recovery tests regularly

To be confident in a disaster recovery plan, testing it is of utmost importance. The best and safest approach is to test at least semi-annually for newly implemented disaster recovery plans, and yearly thereafter. The tests you conduct should verify that your disaster recovery procedures can restore big data workloads to meet the predefined RPO.

  • Use data recovery tools

It is important to ensure that you have good data recovery tools at your disposal during disaster recovery. This may require having backup instances of the tools available in case your production environments are destroyed.

  • Have continuity in data gathering

A disaster does not stop the flow of incoming data. During disaster recovery, you need to make sure that data is captured continuously even while your operations are down. Make sure that your backup storage locations have enough spare capacity to absorb the new data generated during restore operations.
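One way to picture this requirement is a toy ingest buffer. This is a hypothetical sketch of the general idea, not any specific product's design: records arriving during an outage are held in spare capacity and drained into primary storage once it is restored.

```python
from collections import deque

class BufferedIngest:
    """Toy sketch: keep capturing incoming records during an outage by
    spilling to a local buffer, then drain it once primary storage is back."""
    def __init__(self, primary_up=True):
        self.primary_up = primary_up
        self.primary = []        # stand-in for real primary storage
        self.buffer = deque()    # spare capacity used during the outage

    def ingest(self, record):
        if self.primary_up:
            self.primary.append(record)
        else:
            self.buffer.append(record)   # nothing is dropped mid-disaster

    def restore(self):
        self.primary_up = True
        while self.buffer:
            self.primary.append(self.buffer.popleft())

ing = BufferedIngest(primary_up=False)
for r in ["event-1", "event-2"]:
    ing.ingest(r)
ing.restore()
print(ing.primary)  # ['event-1', 'event-2']
```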

In a rapidly changing environment where organizations are prone to threats, it is vital to have a robust and well-tested disaster recovery plan in place. As the saying goes, "Prevention is better than cure," so securing and backing up data using new technologies should be of utmost importance for organizations.

 

Data Challenges faced by Virtual Reality

 

Virtual Reality (VR) is a buzzword in this information age, but the situation will soon change as many companies come to understand its applications. As the technology rapidly becomes more widespread and awareness of its potential grows day by day, VR will go mainstream. However, like any new and revolutionary technology, along with all the benefits it offers, it also presents a number of significant challenges. Chief among them is how to deliver massive files over the internet.

 

The possibilities with VR are endless; however, the challenges need to be addressed. Some of them are listed below:

  • Data Storage

File sizes generated by VR are significant; depending on the camera, a shoot can produce up to 1 TB of data per hour. A sports organization streaming 60 games per year at three hours per game must be prepared to store more than 180 TB. Given such sizes, storing this data is the biggest barrier for companies planning to invest in VR. The ideal storage scenario is one that enables organizations to access and edit their VR files as necessary, while also allowing them to archive those files economically, protecting and preserving the data for as long as it's needed.
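The storage figure quoted above is easy to reproduce. A small Python sketch (the replication factor is an illustrative assumption, not a cited number):

```python
TB_PER_HOUR = 1          # upper-end VR capture rate cited above
GAMES_PER_YEAR = 60
HOURS_PER_GAME = 3

raw_tb = TB_PER_HOUR * GAMES_PER_YEAR * HOURS_PER_GAME
print(f"Raw VR footage per season: {raw_tb} TB")  # 180 TB

# Archiving typically adds replication overhead; e.g. two archive copies
# (hypothetical factor for illustration):
COPIES = 2
print(f"With {COPIES} archive copies: {raw_tb * COPIES} TB")
```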

  • Content Distribution

Technological development in the media sector will cause an enormous increase in data production and consumption. Content providers will need platforms that support VR content and a robust infrastructure with the capacity to handle large amounts of data. Distributing this data will be a headache, as VR content requires bandwidth many times higher than a normal data transfer. Traditional methods such as FTP and HTTP struggle to move large volumes of data over long distances, so faster file transfer methods are needed. eSecureSend, our SaaS solution, moves data 20X faster than regular FTP. Organizations need a secure, reliable, easy-to-use, and fast method for transferring enormous VR files, anywhere and at any time.

  • Device challenges

VR devices are bulky and uncomfortable to wear, and the field of view (FOV) is their biggest limitation. These devices offer a FOV of up to 90 degrees, compared to the roughly 190 degrees horizontal and 120 degrees vertical of normal human vision. To create impressive experiences, they must capture as much of the FOV as possible; for the human eye, the projected image needs to fill a large FOV.

  • Higher cost & streaming

VR technology is very expensive. The Oculus Rift and the HTC Vive cost upwards of $500 for the combo of headset and controllers. To solve one component of the latency problem, the machine driving the content has to be powerful enough to maintain a recommended frame rate of more than 90 fps, which is why VR content only works well on powerful machines. Streaming is also a challenge: because the content must be high resolution and render at a much higher refresh rate, the bandwidth required for streaming grows substantially.

Virtual Reality is changing the internet and more VR content is being developed every day, but mass-market adoption and success are impossible without an effective, reliable and scalable content delivery strategy. If virtual reality is expected to be a future staple, there is plenty of work to be done around the technologies and the overall user experience.

 

The changing business of large file transfer: from FTP to SaaS

Businesses these days run on data: facts, figures, and values. However, that same data becomes a nasty drain clog when it stockpiles. Organizations face significant financial losses if they can't bring data to the right place at the right time. The flow of data across the Internet is steadily and constantly increasing, with annual global Internet Protocol (IP) traffic predicted to reach 3.3 zettabytes by 2021. Per the Cisco report, overall IP traffic is expected to grow to 396 EB per month by 2022, up from 122 EB per month in 2017, a CAGR of 26 percent.

Via https://www.cisco.com/c/en/us/solutions/collateral/service-provider/visual-networking-index-vni/white-paper-c11-741490.html

As most organizations need to transfer huge amounts of unstructured data across borders, they tend to rely on the outdated FTP technology. The FTP method solves less than half of the problem: the protocol doesn't handle more than 30 connections, and its 8% failure rate is very expensive. Beyond the costs, there are reliability, security, auditing, and flexibility issues associated with it. Some of the reasons why we need to stop using FTP are listed here: https://www.esecuresend.com/why-you-need-to-stop-using-ftp/

The latest technology used for data transfer is managed file transfer delivered as a Software as a Service (SaaS) platform. This platform enables data security, operational efficiency, governance, and compliance. Our product eSecureSend (eSS) is a managed file transfer solution on a SaaS platform for large data transfers. It is widely used across industry segments like media and healthcare, and by others that deal with video production, video marketing, and advertising. With our hands-on onboarding process and sophisticated platform, eSS is the best solution for meeting any enterprise and B2B data transfer need.

 

The SaaS platform is a software delivery model in which the software and its associated data are hosted remotely, in the cloud. SaaS solutions enable user access through a web browser client, promoting collaboration between remote locations and allowing quick customization. The flexibility and accessibility of SaaS solutions also help reduce the cost of IT maintenance and support.

 

eSecureSend, our cloud-based managed file transfer solution built on a SaaS platform, provides scalable, reliable, secure enterprise solutions to power the cloud and deliver value to your customers. It lets you manage a single, security-rich, highly reliable connection between you and your partners. This file transfer solution offers an alternative to the more expensive FTP method and other on-premise software while reducing the operational impact on your technology staff. eSS provides complete end-to-end encryption of data in transit and at rest. All PII is encrypted prior to upload. The block cipher is AES, the same cipher used by the United States National Security Agency, with elliptic-curve encryption layered on top for additional security.

Try eSS Today!

30 Days Free Trial

Why There Is a Need for a Backup and Disaster Recovery Plan in Business

An effective and reliable backup and disaster recovery plan is essential to virtually every business today. It has become a crucial part of every company's IT strategy; without it, you will face major challenges after a cyber-attack, human error, or natural disaster, any of which can ultimately result in data loss. The money and time invested in a backup and disaster recovery plan help you minimize risk while giving you peace of mind. Having this plan in place ensures that, as a business owner, you've taken all the critical steps to protect your company's data. Security Week reported that the total volume of data loss in the enterprise rose more than 400 percent over the past few years, and IT Web reported that the total cost of data breaches will skyrocket, reaching $2.1 trillion by 2019.

What is a backup & disaster recovery plan?

A backup & disaster recovery (BDR) plan combines a set of processes and techniques that work cohesively to ensure that the company's data is secure, backed up, and quickly recoverable after a disaster occurs. These incidents typically include human error, insider threats, natural disasters, cyber-attacks, and hardware failure.

 

Think of your disaster recovery plan as an insurance policy for your data, and rest assured that your business is safe. Here are the top reasons why your backup and disaster recovery plan should be a priority for your business continuity.

  • Client data needs to be safe

If you are storing your client’s confidential data, you can’t afford to lose it or let it slip into the wrong hands. The backup and disaster recovery plan ensures that all of this information is properly stored and controlled. As a result, you don’t have to worry about damaging your brand reputation if any unforeseeable incident occurs.

 

  • Human errors

Not all disasters are natural; sometimes the cause is human error. While we can't eradicate human error entirely, we can take steps to reduce it and ensure that recovery time is fast. Mistakes are bound to happen, and a single poor choice can end up compromising data. That's why it's so important for businesses not only to train employees properly but also to invest in backup solutions.

 

  • Protection from natural disasters

There are plenty of situations that can cause a company to experience downtime: a flood, fire, hurricane, or other natural calamity. A backup and disaster recovery plan helps you protect your data and ensures that the downtime does not cripple your company. It helps your company jump back into regular operations quickly and, in the process, lose little to no data.

 

  • System failures

Even though modern IT hardware is resilient, no machine is immune to connection failures or system crashes. No matter what technology you use or how much you spend on it, no solution is flawless. Since fixing all of these failures is a costly affair, having a BDR plan in place helps reduce these risks. Such a plan helps you avoid capital expenses while ensuring the strictest protection from service interruptions due to IT infrastructure failures.

 

  • Minimize the impact of cyber attacks

The rising rate of cybercrime redoubles the need to fortify your business. Criminals are expanding their techniques to attack small businesses they consider susceptible. The generally weak security maintained by many small businesses makes them an easy target for attacks and probes. A well-designed BDR plan can minimize the impact of attacks by cybercriminals, enabling the organization to recover its data and maintain business continuity.

 

Ultimately, even if we try our best to prepare for and prevent the worst from happening, some situations are out of our control. As they say, "Precaution is better than cure," so the best course of action is to have a BDR plan in place to reduce the impact of a disaster on your business.

Dmorph Inc. announces the launch of its new service “eSS Lite”

DURHAM, NC — Dmorph Inc. has done it again. Today we announce the release of eSS Lite, an exclusive new file transfer service to transmit large files quickly, reliably, and securely. The new service, eSS Lite, represents a new industry best tool for managed file transfer and a new level of service for customers of all types and sizes.

With its ability to quickly, reliably, and securely transfer files of any size, eSS Lite will appeal to anyone who generates large files and datasets, including engineers, designers, scientists, photographers, artists, video producers, marketers, and other creative professionals, as well as the clients and colleagues they work with. eSS Lite addresses the file size limitations plaguing competing products, removing any need to break up large files before transfer. eSS Lite’s proprietary Perpetual Motion feature allows files to keep moving, even if a large file transfer is interrupted — transfers pick right up where they left off, saving customers precious time. All transfers are completely secure, with 256 Bit AES encryption protocol at the foundation of the eSS Lite product.

“We have released this service by keeping business users’ challenges in mind. We want our customers to focus on productivity, not on file transfer. We build all of our products keeping security at their core and then adding industry-leading speed and reliability to the mix. We focus ourselves on continuous product development so we can always provide our customers with the best tools in the industry.” says Mithila S, Dmorph’s Marketing Team Member.

eSS Lite allows customers to easily & quickly transfer files of different sizes. We offer several tiered usage plans based on individual customers’ requirements. All plans include Data Rollover, wherein unused data from one month’s plan allowance will be rolled over to the next billing period. Other features include:

  • Industry-Leading Speed: With transfer speeds 20X faster than traditional FTP, our global network of servers provides you with the fastest transfer speeds wherever you are and wherever your files are going.
  • Highest Security: All Dmorph products, including eSS Lite, use end-to-end encryption for all transfers.
  • Detailed Reporting & Tracking: Our audit trail feature allows users to monitor ongoing file movements and review all transfers, and the interface is intuitive and easy-to-use.
  • Enhanced Collaboration: eSS’s Mission Control is a centralized control panel that enables designated administrators to monitor users and file movement.
  • Foolproof File Transfer: eSS’s Perpetual Motion feature prevents file transfer crashes, resuming transfers from the point of failure, saving our customers countless hours of precious time.
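The idea behind a resume-from-offset feature like Perpetual Motion can be illustrated with a toy sketch. This is a hypothetical illustration of the general technique, not eSS's actual implementation: the sender persists the byte offset it has confirmed, so an interrupted transfer restarts from that offset instead of from byte zero.

```python
import io

CHUNK = 4  # bytes per chunk (tiny for demo; real transfers use megabytes)

def send_chunks(src: bytes, dst: io.BytesIO, offset: int, max_chunks: int) -> int:
    """Send up to max_chunks chunks of src into dst, starting at offset.
    Returns the new offset, which the sender persists so an interrupted
    transfer can resume from the point of failure rather than the start."""
    for _ in range(max_chunks):
        if offset >= len(src):
            break
        dst.write(src[offset:offset + CHUNK])
        offset = min(offset + CHUNK, len(src))
    return offset

payload = b"large file contents!"
sink = io.BytesIO()

# First attempt "drops" after two chunks...
offset = send_chunks(payload, sink, 0, max_chunks=2)
# ...and the resume continues from the saved offset, not from scratch.
offset = send_chunks(payload, sink, offset, max_chunks=100)
print(sink.getvalue() == payload)  # True
```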

Dmorph is committed to improving the productivity of our customers by redefining the speed, security, and reliability of file transfer and storage. The launch of eSS Lite marks an exciting new tool for all high-volume data producers to manage their file sets with the same industry-leading features and specifications that were previously only available to enterprise-level customers.

For more information and to try eSS Lite, go to https://www.esecuresend.com/ess_lite_pricing.

Stewart Inc. Chooses eSecureSend For Reliable & Fast Delivery Of Data

Architecture, Engineering, and Construction (AEC) organizations today require specialized expertise, and because they are distributed across many locations, they face serious collaboration challenges. Workers at remote locations contend with latency issues, document version control, and the need to access, edit, and update files from anywhere. Digital projects at architectural and engineering companies are huge, so moving large 3D CAD drawings becomes a painful task.

Stewart Inc., a design, engineering, and planning firm located in Raleigh, Durham, and Charlotte, NC, uses eSecureSend to deliver gigabytes of data to its clients. They have used eSS to transfer sensitive, large files to their clients while making sure that collaboration within their teams is handled smoothly. eSecureSend, our secure file transfer solution, helps the AEC industry build a strong foundation for project success.

eSecureSend was a total game changer for Stewart. We asked one of their managers, Mr. Dustin Manning, about the problems they were experiencing before using eSecureSend for sending files. He said that "it was difficult to find a cost effective, reliable service to transmit multiple gigabytes of data quickly." eSecureSend helped them resolve this problem.

 

"eSecureSend was very easy to install and the process of uploading is effortless and amazingly fast." – Dustin Manning, Manager of Geomatics Technology, Stewart Inc.

 

eSecureSend provides a professional platform for easy sharing and collaborating on large files.

For us, no file is too big and no site is too remote. Our simple SaaS file transfer solution eliminates the time wasted shipping hard drives across borders, saves money, and delivers real-time access to blueprints, CAD/CAM files, and designs for every construction project.

 

Stewart Inc. finds eSecureSend to be the simplest and most reliable method for transmitting large amounts of data to a user or group of users. They say they can connect architects, surveyors, and designers with engineers and workers on the ground through a quick, simple, completely reliable, and secure file transfer method.

Challenges faced by content creators in adopting 4K technology

The continued surge in 4K TV manufacturing and sales is slowly but surely driving increased consumer demand for 4K media. However, content creators have yet to adopt it. Significant challenges affect every step in the media production workflow, from ingestion to post-production. Delivery is especially burdened, as getting content to broadcasters and consumers is complicated by the larger file sizes.

ATSC 3.0, the next-generation broadcast transmission standard, is going to change everything. Broadcasters will be able to reliably deliver 4K content to consumers while also providing targeted advertising. It has been live for more than a month now. If there ever was a time to be 4K ready, now's the time to start.

What makes 4K so difficult to work with?

Simply put, it's the sheer size of the digital files. They're several times bigger than what content creators are used to working with: 4K is 4 times the resolution of a 1080p display, and about 23 times the resolution of SD television. Since the resolution is higher, the file sizes are also much bigger.
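The resolution ratios quoted above follow directly from pixel counts. A quick check in Python (the exact SD ratio depends on which SD frame size you assume; 720×480 is used here as one common definition):

```python
uhd = 3840 * 2160          # 4K UHD frame
fhd = 1920 * 1080          # 1080p full HD frame
sd  = 720 * 480            # NTSC-style SD frame (one common definition)

print(uhd // fhd)          # 4  -> "4 times the resolution of 1080p"
print(round(uhd / sd))     # 24 -> in the ballpark of the "about 23x"
                           #       figure, depending on the SD standard
```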


Post-production challenges of 4K content

Consumers are increasingly drawn to modern, advanced technologies, which in turn has driven up demand for 4K services, whether in the cinema or at home.

For a post-production industry that suddenly has to manipulate and process images roughly 70x larger than they were a scant couple of decades ago, the problems brought by this massive leap in resolution are probably greater than for most, since studios must constantly upgrade their services to match market demand. The challenges they face are listed below:

  • More work to clean up shots – 4K is four times sharper than HD, and its problems are beginning to show up in full force, affecting actors, casting directors, make-up artists, directors of photography, and lighting directors. With 4K, every wrinkle, every ounce of makeup, and every shadow is clearly visible, more so than with panoramic film. This has created more work and more rules for everyone, so that every detail is clear and perfect.

 

  • Computing power & speed issues – With 4K, extra time is required for processing, as the footage takes twice as long to read, view, and render. You also need to make sure that if you double, triple, or quadruple a shot, you check it before it is rendered; otherwise you must redo the job until it's perfect. 4K footage requires much more computing power to edit and render, and 4K remains very expensive: so expensive that outside of a few high-end private post-production/screening facilities, there isn't any place you can actually see 4K resolution onscreen.

 

  • Storage – With 4K, every frame carries a great deal of data, and the performance needs of your infrastructure are higher. This makes it important to have the right storage ready to support your workflow. Storing all of this extra data on expensive, fast storage can be cost-prohibitive.

 

4K video has continued to become more prevalent in cinematography. Since the camera is the prerequisite of the process, managing the rapid growth in footage resolution requires investing in the right camera equipment. Cameras and sensors used to capture content have to support very high frame rates, while image file formats must be suited to processing and integration into post-production workflows. These images generate huge amounts of data that need to be accessed, manipulated, and stored in real time. There are also costly 4K upgrades for broadcasters to consider in order to handle live broadcasts in 4K: encoding, switching, and other hardware.

Delivery challenges of 4K content

  • Getting content to partners and broadcasters – Because 4K content has a higher resolution, streaming or delivering it requires gobs of bandwidth. Many companies still rely on the oldest method of content delivery, FTP, which is not a reliable and efficient solution for managing and delivering content. Because this method has several drawbacks, we built our content delivery service eSecureSend to streamline the content delivery workflow. With eSS you can deliver your 4K data reliably, accurately, and securely. It is a boon to companies that want to deliver data to their partners and broadcasters, or vice versa.

 

  • Broadcast bandwidth requirements – One of the specific challenges facing the video broadcasting world is delivering high-quality 4K video content from point A to point B over significant distances. Because the transmission bandwidth of a given link, say cable or a cellular network, is fixed, more and better-quality video data must be pushed to consumption devices like TVs, tablets, and smartphones. While full HD demands less than 5 Gbit/s, 4K requirements jump to 10 Gbit/s, as 4K video is delivered uncompressed to ensure high picture quality with very low latency. That means traditional cabling options are not capable of delivering 4K signals over distances of more than a meter, an unrealistic proposition in most commercial projects and some consumer applications.
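The bandwidth figures above can be sanity-checked with simple arithmetic. This sketch assumes one common broadcast configuration for each tier (10-bit 4:2:2 at 60 fps for 4K, 8-bit 4:2:0 at 30 fps for full HD); these are illustrative assumptions, and real SDI links add blanking and protocol overhead on top of the raw payload:

```python
def uncompressed_gbps(width, height, bits_per_pixel, fps):
    """Raw video bitrate in Gbit/s before any compression."""
    return width * height * bits_per_pixel * fps / 1e9

# Full HD, 8-bit 4:2:0 (12 bits/pixel) at 30 fps stays well under 5 Gbit/s:
print(round(uncompressed_gbps(1920, 1080, 12, 30), 2))   # 0.75

# A common 4K broadcast format, 10-bit 4:2:2 (20 bits/pixel) at 60 fps,
# lands right around the 10 Gbit/s figure cited above:
print(round(uncompressed_gbps(3840, 2160, 20, 60), 2))   # 9.95
```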

 

  • With ATSC 3.0, these challenges are being overcome, and fiber optic cables are increasingly being laid across the country.

Finally, making the transformation to 4K technology will raise questions about technology capabilities, workflow considerations and where the industry is headed in the future. Adopting 4K production capability can be done in stages to minimize the impact to already existing HD production workflows, develop familiarity and best practices, and ultimately scale to meet the largest needs and future challenges of 4K production and beyond.

If delivery issues are overcome with accelerated file transfer services like eSecureSend and next-gen transmission standards such as ATSC 3.0, what's holding content creators back from upgrading to 4K today?

Why the Cloud gives you flexibility in solving 4K challenges

The cloud gives us tremendous capabilities and efficiencies like nothing we have seen before. Government, business, academic, and other organizations that embrace cloud technology are achieving competitive advantages through better utilization of IT resources and faster service for their customers. Most of the challenges of adopting 4K technology can be solved through the cloud.

  • Flexibility to scale and pay for what you need as you use it – Cloud services can save companies costs on hardware, software, and maintenance. Users purchase the ability to access and use an application or service hosted in the cloud, giving them the flexibility to pay only for the time they actually use the service.

 

  • Integrate with services that take on the burden of upgrading equipment – Cloud services enable companies to deliver applications more efficiently by removing the complexities involved in managing their own infrastructure. They enable fast deployment of applications and improve the agility of IT services by instantly adding computing power and storage capacity when needed.

Vendors and integration platforms with proven track records that provide cost-effective, reliable, and simple delivery of 4K, a future-proof format for ultra-high-definition content, can help you avoid the pitfalls. Such partnerships make adapting to the latest 4K technology a painless affair.

When the telecommunications industry moved from a circuit-switched model to a packet-switched model 30 years ago, it was absolutely revolutionary. It paved the way for the telcos to become what they are today: they moved away from a capital-intensive, technology-focused business model to a more user-focused, service-centric business model. Television broadcasters are doing something similar right now with ATSC 3.0.

ATSC 3.0 offers geo-targeting and personalization. It has interactivity. It has a targeted multicast function that is, theoretically, meant for emergency uses such as 911 services. According to the Consumer Technology Association, the ATSC 3.0 standard is expected to drive sales of 4K TVs, whose higher-resolution pictures can be delivered by the new standard, and to give broadcasters a competitive foothold in the interactive, targeted-advertising IP world.

Content creators are beginning to respond with new content that takes advantage of all the new features UHD TV sets have to offer, and they're moving fast, too. So the impasse is starting to look more like a virtuous cycle, with content availability driving interest in upgrades from last-generation flat panels, and premium display features showing off just how great that content can look.

 

 

Experience Attending Broadcast India Show 2017 in Mumbai

The Broadcast India Show 2017 was held on 12-14 Oct 2017 in Mumbai, India. Companies and corporates, veterans and professionals, suppliers and customers, visionaries and stalwarts from the broadcast and entertainment industry came together to explore opportunities, facilitate trade links, and exchange information on a global level. I had the opportunity to be a part of the event this year.

I am writing this blog shortly after the event in Mumbai and will admit that I found the whole conference experience amazing; it was such a privilege to be able to attend. We had preregistered for the event, so entry was smooth as we collected our badges and lanyards. It was a huge expo, with 415 companies from across the world hosting enormous stalls.

As we started meeting different companies, we came across an Indian company named Digital Navigation, which casts live events to studios via a mobile app. We exchanged ideas about our product's benefits and were overwhelmed with the response we got from them. During the discussion, we found out that their basic pain points were bandwidth and signal breakage during file transfer, as they were still using the old FTP technology.

Conferences were scheduled for all three days of the event. The two sessions that caught my attention were "File based QC on Cloud: Demystified" by Vikas Singhal from Venera Technologies Pvt. Ltd. and "You can Sell any Content Anytime – Film or TV" by Ramesh Meer, MD of Global Content Bazar 2017. All of these events really contributed to the buzz surrounding the conference and certainly made me feel like I was part of an engaged, valued audience.

We came across an interesting drone company that covers concerts and sports events across India. To transfer the data, they use a system called lime stream. We told them about eSS's benefits, and they said they are looking for collaborations for transferring data.

After speaking to a couple of key people in the media industry, we learned that OTT and MAM (media asset management) companies would require a service for transferring large files. We also found out that a couple of our competitors are collaborating with such companies to streamline their file transfer processes.

As for promoting and marketing eSS in this media vertical, there are a couple of industry-focused magazines and online marketing avenues. Examples of such companies that we came across at the Broadcast India Show 2017 were Broadcast Video Producer and Broadcast & Cable Satellite.

The most interesting part of attending the event was getting the opportunity to attend a demo by a company named AVID, based out of Denmark. The demo covered their system integration with GLOOKAST and how their dashboard gives clients the freedom to use the service conveniently. Their senior manager, Mr. Henning Braendeholm, was impressed with our product eSS and opened the doors for a partnership.

Overall, it was a very good learning experience in understanding the media industry and its key requirements. There were diverse views from customers, industry pundits, media experts, and many more. The Broadcast India Show 2017 certainly gave me rich insights into the industry. I look forward to attending many such events in the near future.