The changing business of large file transfer: from FTP to SaaS

Businesses today run on data: facts, figures, and values. But that same data becomes a nasty drain clog when it stockpiles in the wrong place, and organizations face significant financial losses when they can't bring data to the right place at the right time. The flow of data across the Internet is steadily and constantly increasing, with annual global Internet Protocol (IP) traffic predicted to reach 3.3 zettabytes by 2021. Per Cisco's report, overall IP traffic is expected to grow to 396 EB per month by 2022, up from 122 EB per month in 2017, a CAGR of 26 percent.

Via https://www.cisco.com/c/en/us/solutions/collateral/service-provider/visual-networking-index-vni/white-paper-c11-741490.html

Since most organizations need to transfer huge amounts of unstructured data across borders, they tend to rely on outdated FTP technology. FTP solves less than half of the problem: the protocol doesn't handle more than 30 connections, and its failure rate of 8% is very expensive. Beyond the costs, there are reliability, security, auditing, and flexibility issues associated with it. Some of the reasons to stop using FTP are listed here: https://www.esecuresend.com/why-you-need-to-stop-using-ftp/

The latest technology for data transfer is managed file transfer, delivered as a Software as a Service (SaaS) platform. This model enables data security, operational efficiency, governance, and compliance. Our product eSecureSend (eSS) is a managed file transfer solution built on a SaaS platform for large data transfers. It is used across industry segments such as media and healthcare, and by others that deal with video production, video marketing, and advertising. With our hands-on onboarding process and sophisticated platform, eSS is the best solution for meeting enterprise and B2B data transfer needs.

 

SaaS is a software delivery model in which the software and its associated data are hosted remotely, in the cloud. SaaS solutions enable user access through a web browser client, promoting collaboration between remote locations and allowing quick customization. The flexibility and accessibility of SaaS solutions also reduce the cost of IT maintenance and support.

 

eSecureSend, our cloud-based managed file transfer solution on a SaaS platform, provides scalable, reliable, secure enterprise solutions to power the cloud and deliver value to your customers. It lets you manage a single, security-rich, highly reliable connection between you and your partners. This file transfer solution offers an alternative to the more expensive FTP method and other on-premise software, while reducing the operational impact on your technology staff. eSS provides complete end-to-end encryption of data in transit and at rest. All PII is encrypted prior to upload. The block cipher is AES, the same cipher used by the United States National Security Agency, and it is further protected using elliptic curve cryptography for additional security.
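To illustrate what encryption before upload looks like, here is a minimal sketch of AES-256-GCM in Python using the third-party `cryptography` package. The function names and key handling are hypothetical simplifications for illustration, not eSS's actual implementation (which, as described above, also involves elliptic curve cryptography for key protection).

```python
# Sketch: encrypt a blob with AES-256-GCM before upload, decrypt after download.
# Requires the third-party `cryptography` package. Key management is out of scope.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_blob(key: bytes, data: bytes) -> bytes:
    nonce = os.urandom(12)                       # unique nonce per message
    return nonce + AESGCM(key).encrypt(nonce, data, None)

def decrypt_blob(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)        # 256-bit AES key
blob = encrypt_blob(key, b"personally identifiable information")
assert decrypt_blob(key, blob) == b"personally identifiable information"
```

GCM mode also authenticates the ciphertext, so tampering in transit is detected at decryption time rather than silently producing garbage.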

Try eSS Today!

30 Days Free Trial

Why There Is a Need for a Backup and Disaster Recovery Plan in Business

An effective, reliable backup and disaster recovery plan is essential to virtually every business today. It has become a crucial part of every company's IT strategy; without it, you will face major challenges in the event of a cyber-attack, human error, or natural disaster, any of which can ultimately result in data loss. The money and time invested in a backup and disaster recovery plan help you minimize risk while giving you peace of mind. Having this plan in place ensures that, as a business owner, you've taken all the critical steps to protect your company's data. Security Week reported that the total volume of data loss in the enterprise rose more than 400 percent over the past few years, and IT Web reported that the total cost of data breaches will skyrocket, reaching $2.1 trillion by 2019.

What is a backup & disaster recovery plan?

A backup & disaster recovery (BDR) plan combines a set of processes and techniques that work cohesively to ensure that the company's data is secure, backed up, and recoverable quickly after a disaster occurs. These incidents typically include human error, insider threats, natural disasters, cyber-attacks, and hardware failure.
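As a toy illustration of the "backed up and recoverable" half of a BDR plan, here is a minimal Python sketch that copies a file and verifies the copy against a SHA-256 checksum. All names here are illustrative; real BDR tooling layers scheduling, retention, offsite replication, and restore drills on top of this basic copy-and-verify step.

```python
# Sketch: copy a file to a backup directory and verify the copy byte-for-byte.
import hashlib
import pathlib
import shutil

def sha256(path):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
            digest.update(chunk)
    return digest.hexdigest()

def backup(src, dest_dir):
    src = pathlib.Path(src)
    dest = pathlib.Path(dest_dir) / src.name
    shutil.copy2(src, dest)                 # copy data plus metadata
    if sha256(src) != sha256(dest):         # verify the backup, never assume it
        raise IOError(f"backup verification failed for {dest}")
    return dest
```

The verification step matters: a backup that was never checked is one of the most common ways a recovery plan fails exactly when it is needed.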

 

Think of your disaster recovery plan as an insurance policy for your data, and rest assured that your business is safe. Here are the top reasons why a backup and disaster recovery plan should be a priority for your business continuity.

  • Client data needs to be safe

If you are storing your clients' confidential data, you can't afford to lose it or let it slip into the wrong hands. A backup and disaster recovery plan ensures that all of this information is properly stored and controlled. As a result, you don't have to worry about damaging your brand reputation if an unforeseeable incident occurs.

 

  • Human errors

Not all disasters are natural; sometimes the cause is human error. While we can't eradicate human error entirely, we can take steps to reduce it and ensure that recovery time is fast. Mistakes are bound to happen, and a single poor choice can end up compromising data. That's why it's so important for businesses not only to train employees properly but also to invest in backup solutions.

 

  • Protection from natural disasters

Plenty of situations can cause a company to experience downtime: flood, fire, hurricane, or any other natural calamity. A backup and disaster recovery plan helps you protect your data and ensures that the downtime does not cripple your company. It lets your company jump back into regular operations quickly and, in the process, lose little to no data.

 

  • System failures

Even though modern IT hardware is resilient, no machine is immune to connectivity failures or system crashes. It doesn't matter what technology you use or how much you spend on it; no solution is flawless. Since fixing all of these failures can be a costly affair for a company, having a BDR plan in place helps reduce these risks. It can also help you avoid capital expenses while ensuring the strictest protection from service interruptions due to IT infrastructure failures.

 

  • Minimize the impact of cyber attacks

The rising rate of cybercrime doubles the need to fortify your business. Criminals are expanding their techniques to attack small businesses they perceive as susceptible, and the generally weak security maintained by many small businesses makes them an easy target for attacks and probes. A well-designed BDR plan can help you minimize the impact of attacks by cybercriminals: it supports the organization in recovering its data and helps keep the business running.

 

Ultimately, even if we do our best to prepare for and prevent the worst, some situations are out of our control. As the saying goes, "prevention is better than cure," so the best course of action is to have a BDR plan in place to reduce the impact of a disaster on your business.

Challenges faced by content creators in adopting 4K technology

The continued surge in 4K TV manufacturing and sales is slowly but surely driving increased consumer demand for 4K media. However, content creators have yet to adopt it. There are significant challenges affecting every step of the media production workflow, from ingestion to post-production. Delivery, both to broadcasters and to consumers, is especially burdened by the larger file sizes.

ATSC 3.0, the next-gen broadcast transmission standard, is going to change everything. Broadcasters will be able to reliably deliver 4K content to consumers while also providing targeted advertising. It has been live for more than a month now. If there ever was a time to be 4K ready, now's the time to start.

What makes 4K so difficult to work with?

Simply put, it's the sheer size of the digital files: they're many times bigger than what content creators are used to working with. 4K is 4 times the resolution of a 1080p display, and roughly 23 times the resolution of SD television. Since the resolution is higher, the file sizes are much bigger too.
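The resolution arithmetic behind those figures can be checked directly. A quick sketch, assuming 3840x2160 UHD, 1920x1080 Full HD, and 720x480 SD frames (the exact SD definition varies, which is why the ratio cited above is approximate):

```python
# Pixel-count ratios between UHD, Full HD, and SD frames.
def pixels(width, height):
    return width * height

uhd = pixels(3840, 2160)   # 8,294,400 pixels
fhd = pixels(1920, 1080)   # 2,073,600 pixels
sd  = pixels(720, 480)     #   345,600 pixels

print(uhd / fhd)   # 4.0  -> 4x the pixels of 1080p
print(uhd / sd)    # 24.0 -> in line with the ~23x SD figure cited above
```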


Post-production challenges of 4K content

Consumers are increasingly drawn to modern, advanced technologies, which in turn has driven up demand for 4K services, whether in the cinema or at home.

For a post-production industry that suddenly has to manipulate and process images 70x the size they were a scant couple of decades ago, the problems brought by this massive leap in resolution are probably greater than for most, since post houses must constantly consider upgrading their services to match market demand. The main challenges they face are listed below:

  • More work to clean up shots – Problems caused by 4K, which is 4 times sharper than HD, are beginning to show up for everyone involved: actors, casting directors, make-up artists, directors of photography, and lighting directors. With 4K, every detail (every wrinkle, every ounce of makeup, every shadow) is clearly visible, more so than with panoramic film. This has created more work, and more rules, for everyone so that every detail is clear and perfect.

 

  • Computing power & speed issues – With 4K, extra time is required for processing, as the footage takes twice as long to read, view, and render. If you double, triple, or quadruple a shot, you need to check it before it is rendered; otherwise you must redo the job until it's perfect. 4K footage requires much more computing power to edit and render, and 4K remains very expensive: so expensive that outside of a few high-end private post-production and screening facilities, there are few places you can actually see true 4K resolution onscreen.

 

  • Storage – With 4K, every frame carries a lot of data, and the performance needs of your infrastructure are higher. This makes it important to have the right storage ready to support your workflow. Storing all of this extra data on expensive fast storage can be cost-prohibitive.

 

4K video has continued to become more and more prevalent in cinematography. Since the camera is the prerequisite of the whole process, managing the rapid growth in footage resolution requires investing in the right camera equipment. Cameras and sensors used to capture content have to support very high frame rates, while image file formats must be suited to processing and integration into post-production workflows. These images generate huge amounts of data that needs to be accessed, manipulated, and stored in real time. There are also costly 4K upgrades for broadcasters to consider in order to handle live broadcasts in 4K: encoding, switching, and other hardware.
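To get a feel for the data volumes involved, here is a back-of-the-envelope calculation. The assumptions (uncompressed 8-bit RGB, 3 bytes per pixel, 24 fps) are illustrative; real camera formats use different bit depths, chroma subsampling, and compression, so actual rates vary widely:

```python
# Rough data rate of uncompressed 4K capture under simple assumptions.
def uncompressed_rate_mb_per_s(width, height, bytes_per_pixel, fps):
    return width * height * bytes_per_pixel * fps / 1e6

rate = uncompressed_rate_mb_per_s(3840, 2160, 3, 24)  # 8-bit RGB at 24 fps
print(round(rate))            # ~597 MB/s
print(round(rate * 3600 / 1e6, 2))  # ~2.15 TB per hour of footage
```

Even heavily compressed acquisition formats leave 4K projects juggling terabytes, which is what makes the storage and delivery challenges above so acute.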

Delivery challenges of 4K content

  • Getting content to partners and broadcasters – Because 4K content has a much higher resolution, streaming or delivering it requires gobs of bandwidth. Yet many companies still rely on the oldest method of content delivery, FTP, which is hardly a reliable or efficient way to manage and deliver content. Because that method has a couple of serious drawbacks, we built our content delivery service, eSecureSend, to streamline the content delivery workflow. With eSS you can deliver your 4K data reliably, accurately, and securely: a boon to companies that want to deliver their data to partners and broadcasters, or vice versa.

 

  • Broadcast bandwidth requirements – One specific challenge facing the video broadcasting world is delivering high-quality 4K video content from point A to point B over significant distances. Because the transmission bandwidth of a given link, say cable or a cellular network, is fixed, the task is to push more and better-quality video data to consumption devices like TVs, tablets, and smartphones. While full HD demands less than 5 Gbit/s, 4K requirements jump to 10 Gbit/s when the video is delivered uncompressed to ensure high picture quality with very low latency. That means traditional cabling options are not capable of delivering 4K signals over distances of more than a meter, an unrealistic proposition in most commercial projects and some consumer applications.

 

  • ATSC 3.0 and fiber – With ATSC 3.0 rolling out and fiber optic cables increasingly being laid across the country, these delivery challenges are starting to be overcome.
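The delivery bottleneck described above is ultimately simple arithmetic: file size in bits divided by usable bandwidth. A quick sketch with hypothetical figures (real-world throughput is usually well below the advertised link speed):

```python
# Hours needed to move a file of size_gb gigabytes over a link of bandwidth_mbps.
def transfer_hours(size_gb, bandwidth_mbps):
    bits = size_gb * 8_000          # gigabytes -> megabits
    return bits / bandwidth_mbps / 3600

print(round(transfer_hours(1000, 100), 1))    # 1 TB at 100 Mbps: ~22.2 hours
print(round(transfer_hours(1000, 1000), 1))   # 1 TB at 1 Gbps:  ~2.2 hours
```

At FTP's reliability levels, a multi-hour transfer that fails near the end must be restarted from scratch, which is why resumable, accelerated transfer services matter for files this size.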

Finally, making the transformation to 4K technology will raise questions about technology capabilities, workflow considerations and where the industry is headed in the future. Adopting 4K production capability can be done in stages to minimize the impact to already existing HD production workflows, develop familiarity and best practices, and ultimately scale to meet the largest needs and future challenges of 4K production and beyond.

If delivery issues are overcome with accelerated file transfer services like eSecureSend and next-gen transmission standards such as ATSC 3.0, what's holding content creators back from upgrading to 4K today?

Why the Cloud gives you flexibility in solving 4K challenges

The cloud gives us tremendous capabilities and efficiencies unlike anything we have seen before. Government, business, academic, and other organizations that embrace cloud technology are achieving competitive advantages through better utilization of IT resources and faster service for their customers. Most of the challenges of adopting 4K technology can be solved through the cloud.

  • Flexibility to scale and pay for what you need as you use it – Cloud services can save companies costs on hardware, software, and maintenance. Users purchase the ability to access and use an application or service hosted in the cloud, so they have the flexibility to pay only for the time they actually use the service.

 

  • Integrate with services that take on the burden of upgrading equipment – Cloud services enable companies to deliver applications more efficiently by removing the complexities of managing their own infrastructure. They enable fast deployment of applications and improve the agility of IT services by instantly adding computing power and storage capacity when needed.

Vendors and integration platforms with proven track records, offering cost-effective, reliable, and simple delivery of 4K, a future-proof format for ultra-high-definition delivery, can help you avoid the pitfalls. Such partnerships can make adapting to the latest 4K technology a relatively painless affair.

When the telecommunications industry moved from a circuit-switched model to a packet-switched model 30 years ago, it was absolutely revolutionary. It paved the way for the telcos to become what they are today: they moved away from a capital-intensive, technology-focused business model to a more user-focused, service-centric business model. Television broadcasters are doing something similar right now with ATSC 3.0.

ATSC 3.0 has geo-targeting and personalization. It has interactivity. It has a targeted multicast function that is, theoretically, intended for emergency uses such as 911 services. The ATSC 3.0 standard, according to the Consumer Technology Association, is expected to drive sales of 4K TVs, whose higher-resolution pictures can be delivered by the new standard, and to give broadcasters a competitive foothold in the interactive, targeted-advertising IP world.

Content creators are beginning to respond with new content that takes advantage of all the new features UHD TV sets have to offer, and they're moving fast too. So the impasse is starting to look more like a virtuous cycle, with content availability driving interest in upgrades from last-generation flat panels, and premium display features showing off just how great that content can look.

 

 

How Google Fiber Reshaped Where Startups Do Business

The Triangle has become one of the strongest entrepreneurial tech hubs in the United States, offering a quality of life and a lower cost of living that Silicon Valley cannot compete with. With that has come a bevy of innovation and small companies able to compete on a global scale. It has also brought a growth rate that some say will add an additional 1 million people to the Triangle by the end of 2018, leaving both the locals and the infrastructure struggling to keep up with the demands.

Akin to our overburdened, jammed highways during rush hour, local company Dmorph Inc. found that the infrastructure of the existing information superhighway couldn't keep up with its growth. In July 2017, Dmorph Inc. moved its physical office from the American Underground in Durham to office space in neighboring Morrisville, based exclusively on the availability of Google Fiber's faster, more robust internet. A new twist on the old adage: location, location, location.

This is a prime example of technology reshaping the way companies do business, and in this case the way a tech company physically does business. Dmorph's golden child and primary product is eSecureSend, a large data transfer service that is bucking the 40-year norm of companies relying on unreliable FTP and curtailing the expensive, time-consuming, and environmentally harmful practice of shipping hard drives.

For the layperson, the "large" in large data transfer here means data roughly 1000x beyond the maximum capacity of a free Dropbox account. This is a service for entities that send terabytes of data as a routine operating practice; it takes a lot to make those beautiful 4K movies and shows you're enjoying. Primary customers include media, government, and even medical entities, all of which need data securely transferred at the speed of business.

While the advent and physical logistics of Google Fiber have come with some challenges, many of which have been reported in the Triangle, the product itself is solid. Like most startups, Dmorph started out using AWS and Google Cloud. Then came the point where the best financial and strategic move was to get off the cloud and onto business-owned servers. This point in a tech company's growth is almost a modern-day milestone of tech business success.

At this juncture, the reduced costs paired with the increased transfer speeds offered by Fiber restructured Dmorph's balance sheet to the point where the company could invest in the hardware needed to build its own servers. The benefit trickles down to its customer base by allowing a more flexible pricing model without sacrificing service or security.

Dmorph had been presented with options to partner with a data center; however, it would have continually faced data and/or bandwidth caps that would limit the performance of its accelerated file transfer service.

As company spokesperson Yasmeen Kashef put it, "We have big dreams to make file transfer secure, reliable and effective. Having access to high-speed internet like Google Fiber is just the first step. And because Google Fiber exists, all the other telecoms are building their high-speed internet services too. We look forward to ever-increasing high-speed internet.

"The other plus is that it's helped us get together as a team more often. With our Durham office, half of our team members would have to drive about 40 minutes. Now that we're in Morrisville, it's a 15-minute drive for everyone. Even in the age of technology and at a tech firm, more face-to-face collaboration is a bonus."

At the end of the day human interaction and location, location, location still reign supreme in business.

Experience Attending Broadcast India Show 2017 in Mumbai

Broadcast India Show 2017 was held on 12-14 Oct 2017 in Mumbai, India. Companies and corporates, veterans and professionals, suppliers and customers, visionaries and stalwarts from the broadcast and entertainment industry came together to explore opportunities, facilitate trade links, and exchange information on a global level. I had the opportunity to be a part of the event this year.

I am writing this blog shortly after the event in Mumbai and will admit that I found the whole conference experience amazing; such a privilege to be able to attend. We had preregistered for the event, so entry was smooth as we collected our badges and lanyards. It was a huge expo, with 415 companies from across the world at their enormous stalls.

As we started meeting different companies, we came across an Indian company named Digital Navigation, which casts live events to studios via a mobile app. We exchanged ideas about our product's benefits and were overwhelmed by their response. During the discussion we found that their basic pain points were bandwidth and signal breakage during file transfer, as they were still using the old FTP technology.

Conferences were scheduled for all 3 days of the event. The two that caught my attention were "File Based QC on Cloud: Demystified" by Vikas Singhal from Venera Technologies Pvt. Ltd. and "You Can Sell Any Content Anytime – Film or TV" by Ramesh Meer, MD of Global Content Bazar 2017. These sessions really contributed to the buzz surrounding the conference and certainly made me feel part of an engaged, valued audience.

We also came across an interesting drone company, which told us about covering concerts and sports events across India. To transfer their data they use a system called lime stream. We spoke about eSS's benefits, and they told us they are looking for collaborations for transferring data.

After speaking to a couple of key people in the media industry, we learned that OTT and MAM (media asset management) companies would require a service for transferring large files. We also found that a couple of our competitors are collaborating with such companies to streamline their file transfer processes.

As for promoting and marketing eSS in this media vertical, there are a couple of industry-focused magazines and online marketing avenues. Examples we came across at Broadcast India Show 2017 were Broadcast Video Producer and Broadcast & Cable Satellite.

The most interesting part of attending the event was the opportunity to attend a demo by a company named AVID, based out of Denmark. The demo covered their system integration with GLOOKAST and how their dashboard gives clients the freedom to use the service conveniently. Their senior manager, Mr. Henning Braendeholm, was impressed with our product eSS and opened the doors for a partnership.

Overall it was a very good learning experience in understanding the media industry and its key requirements. There were diverse views from customers, industry pundits, media experts, and many more. With Broadcast India Show 2017, I've certainly gained rich insights into the industry, and I look forward to attending many such events in the near future.

The Real Questions to Assess Cost Versus Benefit When Adopting New Technology

A 2013 study conducted by MIT Sloan Management concluded that now, more than ever, companies have 2 choices: "adopt new technologies effectively or face competitive obsolescence." Ouch. Of the 1559 executives and managers they interviewed, 78% said "achieving digital transformation will become critical to their organizations within the next two years." Are you one of the 78%?


Understanding Internet Privacy In The Digital Age

With Congress’ repeal of the FCC’s new internet privacy regulations — so new, in fact, they hadn’t even been put into use — the conversation about the data market has returned to the national spotlight. Unfortunately, despite how it directly affects every person who uses the internet, nearly no one understands any of it. What is ‘privacy’ in the digital age? How should we talk about it?

The digital world and the traces we leave therein have little in common with our ideas about privacy in the physical world. Hiding your screen from prying eyes doesn't do much to prevent data collection. Counterintuitively, using your home network for sensitive online activity instead of public networks might even be less private: it turns out that what happens behind closed doors, and happens routinely behind the same closed door, is actually easier to collect.

What’s the better way of conceptualizing all of it, then? And how might consumers be more conscious of the consequences of their choices?

First, consider the actors at stake. Companies like Facebook and Google, called “edge providers,” have been able to collect data, use it for advertising, or sell it for ad purposes for some time now — but the data they collect is restricted to your activity on their sites and through their ad network. Simply put (perhaps too simply), Google only knows what you do on Google.

Internet Service Providers (ISPs), like AT&T or Comcast, have a much more intimate view of your activity as anything you do on the internet necessarily goes through them — medical information, financial information, and all sorts of other sensitive stuff one might do in the private light of their laptop. But the restrictions ISPs face in using that data have been more stringent. In arguing for the repeal of the FCC’s regulations, ISPs say it’s unfair they can’t engage with the big data market like edge providers can.

The usual metaphor characterizes ISPs as the ‘roads’ one uses to visit the ‘stores’ of edge providers. But if we are to understand digital privacy in terms of the physical world, perhaps a more useful way of thinking about it might be this: ISPs are our personal assistants, our messengers. Our stand-ins. The ones we send out to pick up our prescriptions. The ones we trust with our PINs. The ones that know where we live. It may sound alarmist, but it is important to remember that these ‘roads’ know an awful lot about us.

There are still regulations in place preventing the connection of collected data with any particular person. Also, being voracious consumers of the internet, we often connect through many different ISPs on a given day: wifi at home, 4G on our commutes, public networks at coffee shops, restaurants, airports, et cetera. As such, our activity is often segmented and collected piecemeal — unless we always keep our sensitive information behind the same closed door.


Should we despair? How do the least tech-savvy among us retain their privacy? The details will always require technical knowledge and time to sort through, but there are options for the rest of us. Consider how you navigate the internet and where you do it. If you can stand a slightly slower connection, resources like Tor are invaluable in helping disguise your online activity. Use HTTPS instead of HTTP when you can: it encrypts the data you exchange with any given edge provider. So while the fact that you spent 5 hours on Facebook is plainly visible to your ISP, using HTTPS at least masks the particulars of your stalking.
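As a simplified model of what HTTPS hides: an on-path observer such as an ISP typically learns roughly the hostname you contact (via DNS lookups and the TLS SNI field), but not the path or query string, which travel encrypted. The sketch below illustrates that split; the URL is hypothetical, and real-world visibility varies with features like encrypted DNS.

```python
# What an on-path observer roughly sees for an HTTPS request: the host, not the URL.
from urllib.parse import urlsplit

def observer_view(url):
    """Return the part of an HTTPS URL visible to an on-path observer (simplified)."""
    return urlsplit(url).hostname

url = "https://example.com/search?q=private+medical+question"
print(observer_view(url))   # example.com -- the sensitive query string stays hidden
```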

While we often have no choice in ISPs, we can opt to use more conscientious edge providers. Take note of the privacy and data collection policies of the companies you patronize. If you have sensitive data to transfer, don’t just throw it up on Google Drive — research more secure ways of sharing your files. If you need to search the internet privately, consider different search engines. While sifting through EULAs can be exhausting and confusing, a little thought goes a long way. Just know you can’t expect closing the blinds to do much good.