
Tech News - Cloud Computing

175 Articles

AWS announces more flexibility in its Certification Exams, drops its exam prerequisites

Melisha Dsouza
18 Oct 2018
2 min read
Last week (on 11th October), the AWS team announced that it is removing exam prerequisites to give users more flexibility in the AWS Certification Program. Previously, customers had to pass a Foundational or Associate level exam before appearing for a Professional or Specialty certification. AWS has now eliminated this prerequisite in response to customer requests for flexibility. Customers are no longer required to hold an Associate certification before pursuing a Professional certification, nor do they need a Foundational or Associate certification before pursuing a Specialty certification.

The Professional level exams are tough to pass: without deep knowledge of the AWS platform, clearing them is difficult. A customer who skips the Foundational or Associate level exams and sits directly for a Professional level exam may lack the practice and knowledge necessary to fare well, and failing and having to drop back to the Associate level can be demotivating.

The AWS Certification helps individuals demonstrate expertise in designing, deploying, and operating highly available, cost-effective, and secure applications on AWS, and the proficiency they gain brings tangible benefits. The exams also help employers identify skilled professionals who can use AWS technologies to lead IT initiatives, and reduce the risks and costs of implementing workloads and projects on the AWS platform.

AWS dominates the cloud computing market, and the AWS Certified Solutions Architect exams can help candidates secure their career in this exciting field. AWS offers digital and classroom training to build cloud skills and prepare for certification exams. To know more about this announcement, head over to the official AWS blog.

'AWS Service Operator' for Kubernetes now available allowing the creation of AWS resources using kubectl
Machine Learning as a Service (MLaaS): How Google Cloud Platform, Microsoft Azure, and AWS are democratizing Artificial Intelligence
AWS machine learning: Learning AWS CLI to execute a simple Amazon ML workflow [Tutorial]

Twilio Flex, a fully-programmable contact center platform, is now generally available

Bhagyashree R
18 Oct 2018
3 min read
Yesterday, Twilio announced the general availability of Flex. Since its preview announcement in March, Flex has been used by thousands of contact center agents, including support and sales teams at Lyft, Scorpion, Shopify, and U-Haul. Twilio Flex is a fully-programmable, cloud-based contact center platform that aims to give businesses complete control over customer engagement.

What functionalities does Flex provide to enterprises?

Answer user queries using Autopilot: Flex provides a conversational AI platform called Autopilot, with which businesses can build custom messaging bots, IVRs, and home assistant apps. These bots are trained on data pulled by Autopilot using Twilio's natural language processing engine. Companies can deploy the bots across multiple channels including voice, SMS, chat, Alexa, Slack, and Google Assistant. The bots can respond to frequently asked questions and, if a query becomes complex, hand the conversation over to a human agent.

Secure phone payments with Twilio Pay: With only one line of code, you can activate the Twilio Pay service, which gives businesses the tools needed to process payments over the phone. It relies on secure payment methods such as tokenization to ensure that credit card information is handled safely.

Provide a true omnichannel experience: Flex gives enterprises access to a number of channels out of the box, including voice, SMS, email, chat, video, and Facebook Messenger, among others. Agents can switch from channel to channel without losing the conversation or its context.

Customize the user interface programmatically: Flex user interfaces are designed with customization in mind. Enterprises can customize customer-facing components like click-to-call or click-to-chat, add entirely new channels, or integrate new reporting dashboards to display agent performance or customer satisfaction.

Integrate any application: Enterprises can integrate their third-party business-critical applications with Flex, including systems such as customer relationship management (CRM), workforce management (WFM), reporting, analytics, or data stores.

Analytics and insights for a better customer experience: Flex offers a real-time event stream, a supervisor desktop, and an admin desktop, which give supervisors and administrators complete visibility and control over interaction data. Using these analytics and insights, they can better monitor and manage an agent's performance.

To know more about Twilio Flex, check out the official announcement.

Twilio acquires SendGrid, a leading Email API Platform, to bring email services to its customers
Twilio WhatsApp API: A great tool to reach new businesses
Building a two-way interactive chatbot with Twilio: A step-by-step guide
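The "one line of code" mentioned above refers to Twilio's Pay TwiML verb. As a rough illustration rather than code from the announcement, the sketch below uses the twilio Node helper library to return TwiML that triggers Pay on an incoming call; the function name and charge amount are placeholders, and exact option names may vary by library version.

```typescript
// Illustrative sketch only: build a TwiML response that activates Twilio Pay
// for an incoming call. The charge amount below is a placeholder value.
import twilio from "twilio";

export function payResponse(): string {
  const response = new twilio.twiml.VoiceResponse();
  // <Pay> prompts the caller for card details and tokenizes them securely.
  response.pay({ chargeAmount: "20.45" });
  return response.toString(); // returned as the webhook response body
}
```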

Jeff Bezos: Amazon will continue to support U.S. Defense Department

Richard Gall
16 Oct 2018
2 min read
Just days after Google announced that it was pulling out of the race to win the $10 billion JEDI contract from the Pentagon, Amazon's Jeff Bezos has stated that Amazon will continue to support Pentagon and Defense projects. Bezos went further, criticising tech companies that don't work with the military. Speaking at the Wired25 conference, the Amazon chief said, "if big tech companies are going to turn their back on U.S. Department of Defense (DoD), this country is going to be in trouble... One of the jobs of senior leadership is to make the right decision, even when it's unpopular."

Bezos remains unfazed by criticism

It would seem that Bezos isn't fazed by the criticism other companies have faced. Google explained its withdrawal by saying "we couldn't be assured that it would align with our AI Principles." However, it's likely that significant internal debate about the ethical uses of AI, as well as a wave of protests against Project Maven earlier in the year, were critical components in the final decision.

Microsoft remains in the running for the JEDI contract, but there appears to be much more internal conflict over the issue. Anonymous Microsoft employees have, for example, published an open letter to senior management on Medium. The letter states: "What are Microsoft's AI Principles, especially regarding the violent application of powerful A.I. technology? How will workers, who build and maintain these services in the first place, know whether our work is being used to aid profiling, surveillance, or killing?"

Clearly, Jeff Bezos isn't too worried about upsetting his employees. Perhaps the story says something about the difference in the corporate structure of these huge companies: while they all have high-profile management teams, it's only at Amazon that the single figure of Bezos reigns supreme in the spotlight. With Blue Origin he's got his sights set on something far beyond ethical decision making: sending humans into space. Cynics might even say it's the logical extension of the implicit imperialism of his enthusiasm for the Pentagon.

Twilio acquires SendGrid, a leading Email API Platform, to bring email services to its customers

Natasha Mathur
16 Oct 2018
3 min read
Twilio Inc., a universal cloud communications platform, announced yesterday that it is acquiring SendGrid, a leading email API platform. Twilio has focused mainly on providing voice calling, text messaging, video, web, and mobile chat services; SendGrid, on the other hand, has focused purely on providing email services. With this acquisition, Twilio aims to bring tremendous value to the combined customer bases by offering services around voice, video, chat, and email.

"Email is a vital communications channel for companies around the world, and so it was important to us to include this capability in our platform. The two companies share the same vision, the same model, and the same values," mentioned Jeff Lawson, Twilio's co-founder and chief executive officer.

The two companies will also focus on making it easy for developers to build on a single, best-in-class communications platform, helping them better manage all of their important communication channels, including voice, messaging, video, and email.

Under the terms of the deal, SendGrid will become a wholly-owned subsidiary of Twilio, and once the deal is closed, SendGrid's common stock will be converted into Twilio stock. "At closing, each outstanding share of SendGrid common stock will be converted into the right to receive 0.485 shares of Twilio Class A common stock, which represents a per share price for SendGrid common stock of $36.92 based on the closing price of Twilio Class A common stock on October 15, 2018. The exchange ratio represents a 14% premium over the average exchange ratio for the ten calendar days ending, October 15, 2018", reads Twilio's press release. The boards of directors of both Twilio and SendGrid have approved the transaction.

"Our two companies have always shared a common goal - to create powerful communications experiences for businesses by enabling developers to easily embed communications into the software they are building. Our mission is to help our customers deliver communications that drive engagement and growth, and this combination will allow us to accelerate that mission for our customers", said Sameer Dholakia, SendGrid's CEO.

The acquisition is expected to close in the first half of 2019, subject to the satisfaction of customary closing conditions, including approval by the shareholders of both SendGrid and Twilio. "We believe this is a once-in-a-lifetime opportunity to bring together the two leading developer-focused communications platforms to create the unquestioned platform of choice for all companies looking to transform their customer engagement", said Lawson.

For more information, check out the official Twilio press release.

Twilio WhatsApp API: A great tool to reach new businesses
Make phone calls and send SMS messages from your website using Twilio
Building a two-way interactive chatbot with Twilio: A step-by-step guide

Platform9 announces a new release of Fission.io, the open source, Kubernetes-native Serverless framework

Sugandha Lahoti
16 Oct 2018
3 min read
Platform9 has announced a new release of Fission.io, the open source, Kubernetes-native serverless framework. Its new features enable developers and IT operations to improve the quality and reliability of serverless applications. Fission comes with built-in Live-reload and Record-replay capabilities to simplify testing and accelerate feedback loops. Other new features include Automated Canary Deployments to reduce the risk of failed releases, Prometheus integration for automated monitoring and alerts, and fine-grained cost and performance optimization capabilities. With this latest release, Fission also allows Dev and Ops teams to safely adopt serverless and benefit from the speed, cost savings, and scalability of this cloud-native development pattern on public cloud or on-premises. Let's look at the features in detail.

Live-reload: Test as you type. With Live-reload, Fission automatically deploys code as it is written into a live Kubernetes test cluster. It allows developers to toggle between their development environment and the runtime of the function, to rapidly iterate through their coding and testing cycles.

Record-replay: Simplify testing and debugging. Record-replay automatically saves the events that trigger serverless functions and allows these events to be replayed on demand. Record-replay can also reproduce complex failures during testing or debugging, simplify regression testing, and help troubleshoot issues. Operations teams can use recording on a subset of live production traffic to help engineers reproduce issues or verify application updates.

Automated Canary Deployments: Reduce the risk of failed releases. Fission provides fully automated canary deployments that are easy to configure. It automatically increments the traffic proportion sent to the newer version of a function as long as it succeeds, and rolls back to the old version if the new version fails.

Prometheus integration: Easy metrics collection and alerts. Integration with Prometheus enables automatic aggregation of function metrics, including the number of functions called, function execution time, successes, failures, and more. Users can also define custom alerts for key events, such as when a function fails or takes too long to execute. Prometheus metrics can also feed monitoring dashboards to visualize application metrics.

One of Fission's users, Kenneth Lam, Director of Technology at Snapfish, said, "Fission allows our company to benefit from the speed, cost savings and scalability of a cloud-native development pattern on any environment we choose, whether it be the public cloud or on-prem."

You can learn more about Fission on its website, where you can also go through a quick demo of all the new features.

How to deploy Serverless Applications in Go using AWS Lambda [Tutorial]
Azure Functions 2.0 launches with better workload support for serverless
How Serverless computing is making AI development easier
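To give a sense of the developer workflow these features build on, here is a minimal sketch of creating and invoking a function with the fission CLI. The environment image, function name, and source file are hypothetical, and flags can differ between Fission releases.

```bash
# Minimal sketch (hypothetical names; flags may vary by Fission release).
# Register a Node.js runtime environment in the cluster.
fission env create --name nodejs --image fission/node-env

# Create a function from a local source file and expose it over HTTP.
fission function create --name hello --env nodejs --code hello.js
fission route create --method GET --url /hello --function hello

# Invoke the function once to verify the deployment.
fission function test --name hello
```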

Microsoft announces ‘Decentralized Identity’ in partnership with DIF and W3C Credentials Community Group

Bhagyashree R
12 Oct 2018
3 min read
Yesterday, Microsoft published a white paper on its Decentralized Identity (DID) solution. These identities are user-generated, self-owned, globally unique identifiers rooted in decentralized systems. Over the past 18 months, Microsoft has been working towards building a digital identity system using blockchain and other distributed ledger technologies. With these identities, it aims to enhance personal privacy, security, and control.

Microsoft has been actively collaborating with members of the Decentralized Identity Foundation (DIF), the W3C Credentials Community Group, and the wider identity community to identify and develop critical standards. Together they plan to establish a unified, interoperable ecosystem that developers and businesses can rely on to build more user-centric products, applications, and services.

Why is decentralized identity (DID) needed?

Nowadays, people use digital identities at work, at home, and across every app, service, and device. Access to these digital identities, such as email addresses and social network IDs, can be removed at any time by the email provider, social network provider, or other external parties. Users also grant permissions to numerous apps and devices, which calls for a high degree of vigilance in tracking who has access to what information. A standards-based decentralized identity system empowers users and organizations to have greater control over their data. It addresses the problem of users granting broad consent to countless apps and services by providing a secure, encrypted digital hub where they can store their identity data and easily control access to it.

What it means for users, developers, and organizations

Benefits for users:
- Enables all users to own and control their identity
- Provides secure experiences that incorporate privacy by design
- Supports user-centric apps and services

Benefits for developers:
- Allows developers to provide users personalized experiences while respecting their privacy
- Enables developers to participate in a new kind of marketplace, where creators and consumers exchange directly

Benefits for organizations:
- Organizations can deeply engage with users while minimizing privacy and security risks
- Provides a unified data protocol for organizations to transact with customers, partners, and suppliers
- Improves transparency and auditability of business operations

To know more about decentralized identity, read the white paper published by Microsoft.

Microsoft joins the Open Invention Network community, making 60,000 of its patents accessible to fellow members
Microsoft invests in Grab; together aim to conquer the Southeast Asian on-demand services market with Azure's Intelligent Cloud
Microsoft announces Project xCloud, a new Xbox game streaming service, on the heels of Google's Stream news last week
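To make the idea of a DID more concrete, here is a minimal, illustrative DID document sketch, loosely following the W3C Credentials Community Group drafts referenced above; the identifier, key material, field names, and hub endpoint are all hypothetical and not taken from Microsoft's white paper.

```typescript
// Illustrative only: a rough shape of a DID document. Field names follow
// community drafts of the era; every value here is a made-up placeholder.
const didDocument = {
  "@context": "https://w3id.org/did/v1",
  id: "did:example:123456789abcdefghi",
  publicKey: [
    {
      id: "did:example:123456789abcdefghi#keys-1",
      type: "Ed25519VerificationKey2018",
      controller: "did:example:123456789abcdefghi",
      publicKeyBase58: "placeholderBase58EncodedPublicKey",
    },
  ],
  // Keys the subject can use to prove control of the identifier.
  authentication: ["did:example:123456789abcdefghi#keys-1"],
  // Example service endpoint, e.g. a personal identity hub for user data.
  service: [
    {
      id: "did:example:123456789abcdefghi#hub",
      type: "IdentityHub",
      serviceEndpoint: "https://hub.example.com/did:example:123456789abcdefghi",
    },
  ],
};

console.log(JSON.stringify(didDocument, null, 2));
```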

Microsoft invests in Grab; together aim to conquer the Southeast Asian on-demand services market with Azure’s Intelligent Cloud

Natasha Mathur
09 Oct 2018
2 min read
Microsoft announced yesterday that it is collaborating with Grab, the leading on-demand transportation, mobile payments, and online-to-offline services platform in Southeast Asia, as part of a strategic cloud partnership. The partnership aims to transform the delivery of digital services and mobility by using Microsoft's state-of-the-art expertise in machine learning and other artificial intelligence (AI) capabilities.

"Our partnership with Grab opens up new opportunities to innovate in both a rapidly evolving industry and growth region. We're excited to team up to transform the customer experience as well as enhance the delivery of digital services for the millions of users who rely on Grab for safe and affordable transport, food and package delivery, mobile payments, and financial services", mentioned Peggy Johnson, executive vice president at Microsoft.

Grab is a Singapore-based technology company delivering ride-hailing, ride-sharing, and logistics services via its app in Singapore and neighboring Southeast Asian nations. It currently operates in 235 cities across eight Southeast Asian countries, and its digital wallet, GrabPay, is the top player in Southeast Asia.

The partnership is expected to help both companies explore a wide range of innovative deep technology projects, such as mobile facial recognition with built-in AI for drivers and customers, and using Microsoft Azure's fraud detection services to prevent fraudulent transactions on Grab's platform. These projects aim to transform the experience for Grab's users, driver-partners, merchants, and agents. Grab will adopt Microsoft Azure as its preferred cloud platform, and Microsoft is set to make a strategic investment in Grab, the magnitude of which is currently undisclosed.

"As a global technology leader, Microsoft's investment into Grab highlights our position as the leading homegrown technology player in the region. We look forward to collaborating with Microsoft in the pursuit of enhancing on-demand transportation and seamless online-to-offline experiences for users", said Ming Maa, president of Grab.

There are a few other areas of collaboration between Grab and Microsoft, including Microsoft Outlook integration, Microsoft Kaizala, in-car solutions, and integration of Microsoft Rewards Gift Cards. For more information, check out the official Microsoft blog.

Microsoft open sources Infer.NET, its popular model-based machine learning framework
Microsoft announces Project xCloud, a new Xbox game streaming service, on the heels of Google's Stream news last week
Microsoft's new neural text-to-speech service lets machines speak like people

Google opts out of Pentagon’s $10 billion JEDI cloud computing contract, as it doesn’t align with its ethical use of AI principles

Bhagyashree R
09 Oct 2018
3 min read
Yesterday, Google announced that it will not be competing for the Pentagon's cloud-computing contract, which is supposedly worth $10 billion. It opted out of bidding for the project, named the Joint Enterprise Defense Infrastructure (JEDI), saying the project may conflict with its principles for the ethical use of AI.

The JEDI project involves moving massive amounts of Pentagon internal data to a commercially operated secure cloud system. Bidding for the contract began two months ago and closes this week (12th October). CNBC reported in July that Amazon is considered the number one choice for the contract because it already provides services for the cloud system used by U.S. intelligence agencies. Cloud providers such as IBM, Microsoft, and Oracle are also top contenders, as they have worked with government agencies for decades, and Google's withdrawal could improve their chances of winning the decade-long JEDI contract.

Why has Google dropped out of this bidding?

A Google spokesperson told TechCrunch that the main reason for opting out is that the project doesn't align with the company's AI principles: "While we are working to support the US government with our cloud in many areas, we are not bidding on the JEDI contract because first, we couldn't be assured that it would align with our AI Principles and second, we determined that there were portions of the contract that were out of scope with our current government certifications."

He further added: "Had the JEDI contract been open to multiple vendors, we would have submitted a compelling solution for portions of it. Google Cloud believes that a multi-cloud approach is in the best interest of government agencies, because it allows them to choose the right cloud for the right workload. At a time when new technology is constantly becoming available, customers should have the ability to take advantage of that innovation. We will continue to pursue strategic work to help state, local and federal customers modernize their infrastructure and meet their mission critical requirements."

The decision also follows protests by thousands of Google employees against the company's involvement in another US government project, Project Maven. Earlier this year, some Google employees reportedly quit over the company's work on that project, believing that the U.S. military could weaponize AI and apply the technology towards refining drone strikes and other kinds of lethal attacks. An internal petition asking Google CEO Sundar Pichai to cancel Project Maven was signed by over 3,000 employees. After the protest, Google said it would not renew the contract or pursue similar military contracts, and it subsequently formulated its principles for the ethical use of AI.

You can read the full story on Bloomberg.

Bloomberg says Google, Mastercard covertly track customers' offline retail habits via a secret million dollar ad deal
Ex-googler who quit Google on moral grounds writes to Senate about company's "Unethical" China censorship plan
Google slams Trump's accusations, asserts its search engine algorithms do not favor any political ideology

GitHub’s new integration for Jira Software Cloud aims to provide teams a seamless project management experience

Bhagyashree R
08 Oct 2018
2 min read
Last week, GitHub announced that it has built a new integration to enable software teams to connect their code on GitHub.com to their projects on Jira Software Cloud. The integration updates Jira with data from GitHub, providing better visibility into the current status of your project.

What are the advantages of this new GitHub and Jira integration?

No need to constantly switch between GitHub and Jira: With your GitHub account linked to Jira, your team can see branches, commit messages, and pull requests in the context of the Jira tickets they're working on. The integration provides a deeper connection by allowing you to view references to Jira in GitHub issues and pull requests.

Improved capabilities: This new GitHub-managed app provides improved security, along with the following capabilities:
- Smart commits: Update the status of a ticket, leave a comment, or log time without having to leave your command line or GitHub (see the example below).
- View from within a Jira ticket: View associated pull requests, commits, and branches from within a Jira ticket.
- Searching Jira issues: Search for Jira issues based on related GitHub information, such as open pull requests.
- Check the status of development work: The status of development work can be seen from within Jira projects.
- Keep Jira issues up to date: Automatically keep your Jira issues up to date while working in GitHub.

Install the Jira Software and GitHub app to connect your GitHub repositories to your Jira instance. The previous version of the Jira integration will be deprecated in favor of this new GitHub-maintained integration. Once the migration is complete, the legacy integration (DVCS connector) is disabled automatically.

Read the full announcement on the GitHub blog.

4 myths about Git and GitHub you should know about
GitHub addresses technical debt, now runs on Rails 5.2.1
GitLab raises $100 million, Alphabet backs it to surpass Microsoft's GitHub
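As a rough illustration of the smart commit syntax mentioned above: the issue key, comment, and time value below are hypothetical, and the keywords available depend on your Jira configuration.

```bash
# Hypothetical issue key and values; available keywords depend on Jira setup.
# Reference issue PROJ-42, add a comment to it, and log 1h 30m of work.
git commit -m "PROJ-42 #comment Fix null pointer in payment parser #time 1h 30m"
```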

Cloudera and Hortonworks merge to advance hybrid cloud development, Edge and Artificial Intelligence

Sugandha Lahoti
04 Oct 2018
2 min read
Cloudera and Hortonworks have announced that they are merging to jointly become a data platform provider spanning multi-cloud, on-premises, and the Edge. Together they aim to accelerate innovation in IoT, streaming, data warehousing, and artificial intelligence. The merger will also expand market opportunities for Hortonworks DataFlow and Cloudera Data Science Workbench, along with partnerships with public cloud vendors and systems integrators.

Tom Reilly, chief executive officer at Cloudera, called the merger highly complementary and strategic. He said, "By bringing together Hortonworks' investments in end-to-end data management with Cloudera's investments in data warehousing and machine learning, we will deliver the industry's first enterprise data cloud from the Edge to AI." Rob Bearden, chief executive officer of Hortonworks, agrees, saying, "Together, we are well positioned to continue growing and competing in the streaming and IoT, data management, data warehousing, machine learning/AI and hybrid cloud markets."

The terms of the transaction agreement are:
- Cloudera stockholders will own approximately 60% of the equity of the combined company.
- Hortonworks stockholders will own approximately 40% of the equity of the combined company.
- Hortonworks stockholders will receive 1.305 common shares of Cloudera for each share of Hortonworks stock owned, based on the 10-day average exchange ratio of the two companies' prices through October 1, 2018.
- The companies have a combined fully-diluted equity value of $5.2 billion based on closing prices on October 2, 2018.

The merger is expected to generate significant financial benefits and an improved margin profile for both companies, including:
- Approximately $720 million in revenue
- More than 2,500 customers
- More than 800 customers over $100,000 ARR
- More than 120 customers over $1 million ARR
- More than $125 million in annual cost synergies
- More than $150 million cash flow in CY20
- Over $500 million cash, no debt

Read more about the announcement on the Hortonworks blog.

Hortonworks Data Platform 3.0 is now generally available
Hortonworks partner with Google Cloud to enhance their Big Data strategy
Cloudera Altus Analytic DB: Modernizing the cloud-based data warehouses

CNCF accepts Cloud Native Buildpacks to the Cloud Native Sandbox

Sugandha Lahoti
04 Oct 2018
2 min read
Yesterday, the Cloud Native Computing Foundation (CNCF) accepted Cloud Native Buildpacks (CNB) into the CNCF Sandbox. With this move, Buildpacks will benefit from the vendor neutrality of CNCF as the project embraces cloud-native practices. The Cloud Native Buildpacks project was initiated by Pivotal and Heroku in January 2018. It aims to unify the buildpack ecosystems with a platform-to-buildpack contract, and it incorporates learnings from maintaining production-grade buildpacks at both Pivotal and Heroku.

What are Cloud Native Buildpacks?

At a high level, Cloud Native Buildpacks turn source code into production-ready, OCI-compatible Docker images. This gives users more options to customize the runtime while keeping their apps portable. Buildpacks minimize the initial time to production, reducing the operational burden on developers, and support enterprise operators who manage apps at scale.

Buildpacks were first created by Heroku in 2011. Since then, they have been adopted by Cloud Foundry as well as Gitlab, Knative, Microsoft, Dokku, and Drie. The Buildpack API was open sourced in 2012 with Heroku-specific elements removed, but each vendor that adopted buildpacks evolved the API independently, which led to isolated ecosystems. As part of the Cloud Native Sandbox project, the Buildpack API is being standardized for all platforms. The project is also opening up its tooling and will run buildpacks under the Buildpack GitHub organization.

"Anyone can create a buildpack for any Linux-based technology and share it with the world. Buildpacks' ease of use and flexibility are why millions of developers rely on them for their mission critical apps," said Joe Kutner, architect at Heroku. "Cloud Native Buildpacks will bring these attributes inline with modern container standards, allowing developers to focus on their apps instead of their infrastructure."

Developers can start using Cloud Native Buildpacks by forking one of the Buildpack Samples. You can also read up on the implementation specifics laid out in the Buildpack API documentation.

CNCF Sandbox, the home for evolving cloud native projects, accepts Google's OpenMetrics Project
Google Cloud hands over Kubernetes project operations to CNCF, grants $9M in GCP credits
Cortex, an open source, horizontally scalable, multi-tenant Prometheus-as-a-service becomes a CNCF Sandbox project
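For a feel of the intended workflow, here is a minimal sketch using the project's pack CLI to build and run an app image. The app name, builder image, and port are illustrative, and the CLI was still evolving at the time of this announcement, so flags may differ.

```bash
# Illustrative names; pack CLI flags and builder images may differ by release.
# Build an OCI image from the source in the current directory using a builder.
pack build my-app --builder heroku/buildpacks:18 --path .

# Run the resulting image like any other container image.
docker run --rm -p 8080:8080 my-app
```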

Cloudflare Workers KV, a distributed native key-value store for Cloudflare Workers

Prasad Ramesh
01 Oct 2018
3 min read
On Friday, Cloudflare announced a fast, distributed, native key-value store for Cloudflare Workers, which it is calling Cloudflare Workers KV. Cloudflare Workers is a computing platform built on top of Cloudflare's global network of over 150 data centers. It allows writing serverless code which runs in the fabric of the internet itself, engaging with users faster than other platforms. Workers KV is built on a new architecture which eliminates cold starts and dramatically reduces the memory overhead of keeping code running. Values can be written from within a Cloudflare Worker, and Cloudflare handles synchronizing keys and values across the network.

Cloudflare Workers KV features

Developers can augment their existing applications or build new applications on Cloudflare's network using Cloudflare Workers and Workers KV, which can scale to support applications serving dozens or even millions of users. Some of its features are as follows.

Serverless storage: Cloudflare created a serverless execution environment at each of its 153 data centers with Cloudflare Workers, but customers still had to manage their own storage. With Cloudflare Workers KV, global application access to a key-value store is just an API call away.

Responsive applications anywhere: Serverless applications that run on Cloudflare Workers get low-latency access to a globally distributed key-value store. Workers KV achieves low latency by caching replicas of the stored keys and values across Cloudflare's network.

Build without scaling concerns: Cloudflare Workers KV lets developers focus their time on adding new capabilities to their serverless applications instead of scaling their key-value stores.

Key features of Cloudflare Workers KV, as listed on the Cloudflare website:
- Accessible from all 153 Cloudflare locations
- Supports values up to 64 KB
- Supports keys up to 2 KB
- Read and write from Cloudflare Workers
- An API to write to Workers KV from third-party applications
- Uses Cloudflare's robust caching infrastructure
- Set arbitrary TTLs for values
- Integrates with Workers Preview

Workers KV is currently in beta. To know more, visit the Cloudflare blog and the Cloudflare website.

Bandwidth Alliance: Cloudflare collaborates with Microsoft, IBM and others for saving bandwidth
Cloudflare's decentralized vision of the web: InterPlanetary File System (IPFS) Gateway to create distributed websites
Google introduces Cloud HSM beta hardware security module for crypto key security
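As a rough sketch of reading and writing Workers KV from inside a Worker, the snippet below counts visits per path. The VISIT_COUNTS namespace binding is hypothetical and would need to be configured for the Worker; only the basic get/put calls are shown.

```typescript
// Minimal sketch: a Worker that counts visits per path using a KV namespace.
// VISIT_COUNTS is a hypothetical KV binding configured for this Worker.
declare const VISIT_COUNTS: {
  get(key: string): Promise<string | null>;
  put(key: string, value: string): Promise<void>;
};

addEventListener("fetch", (event: any) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request: Request): Promise<Response> {
  const path = new URL(request.url).pathname;
  // Read the current count (null if the key has never been written).
  const current = parseInt((await VISIT_COUNTS.get(path)) ?? "0", 10);
  // Write the incremented count; Cloudflare replicates it across the network.
  await VISIT_COUNTS.put(path, String(current + 1));
  return new Response(`Visits to ${path}: ${current + 1}`);
}
```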

Bandwidth Alliance: Cloudflare collaborates with Microsoft, IBM and others for saving bandwidth

Prasad Ramesh
27 Sep 2018
2 min read
Cloudflare, a content delivery network service provider, yesterday formed a new group called the Bandwidth Alliance to reduce the bandwidth costs of many cloud users. Cloudflare will provide heavy discounts or free service on bandwidth charges to organizations that are both Cloudflare customers and customers of the cloud providers in this alliance.

Current bandwidth charges

Hosting on most cloud providers includes data transfer charges, known as bandwidth or egress charges, which cover the cost of delivering traffic from the cloud to the consumer. When using a CDN like Cloudflare, the cost of data transfer is charged on top of the content delivery cost. This extra charge makes sense when data has to cross thousands of miles over infrastructure that must be maintained across that distance; that cost ultimately gets added to the customer's final bill. The Bandwidth Alliance aims to eliminate these additional charges and provide more affordable cloud services.

What is the Bandwidth Alliance?

Traffic delivered to users through Cloudflare passes across a Private Network Interconnect (PNI). The PNI is usually within the same facility, formed with a fiber optic cable between routers for the two networks. With no transit provider and no middleman maintaining infrastructure in between, there is no additional cost for Cloudflare or the cloud provider. Cloud service providers use PNIs to deeply interconnect with third-party networks and Cloudflare. Cloudflare carries traffic automatically from the user's location to the Cloudflare data center nearest to the cloud provider, and then over the PNIs. Cloudflare's heavily peered network allows traffic to be carried over these free interconnected links, which is why Cloudflare created the Bandwidth Alliance: to pass the lower costs on to mutual customers. It has teamed up with several cloud providers to make use of their interconnects to benefit end customers.

Some of the current members include Automattic, Backblaze, DigitalOcean, DreamHost, IBM Cloud, Linode, Microsoft Azure, Packet, Scaleway, and Vapor. The alliance is open to more cloud providers. You can read more on the official Cloudflare blog.

Cloudflare's decentralized vision of the web: InterPlanetary File System (IPFS) Gateway to create distributed websites
Microsoft Ignite 2018: New Azure announcements you need to know
Google introduces Cloud HSM beta hardware security module for crypto key security

Microsoft Ignite 2018: New Azure announcements you need to know

Melisha Dsouza
25 Sep 2018
4 min read
If you missed the Azure announcements made at Microsoft Ignite 2018, don't worry, we've got you covered. Here are some of the biggest changes and improvements the Microsoft Azure team has made to its cloud offering.

Infrastructure improvements

Azure's new capabilities to deliver the best infrastructure for every workload include:

1. GPU-enabled and High-Performance VMs
Azure has announced the preview of GPU-enabled and High-Performance Computing virtual machines. The two new N-series virtual machines, the NVv2 VMs and the NDv2 VMs, have NVIDIA GPU capabilities. The two new H-series VMs, the HB VMs and the HC VMs, are optimized for performance and cost and are aimed at HPC workloads like fluid dynamics, structural mechanics, energy exploration, weather forecasting, risk analysis, and more.

2. Networking
Azure has announced the general availability of Azure Firewall and Virtual WAN, and the preview of Azure Front Door Service, ExpressRoute Global Reach, and ExpressRoute Direct. Azure Firewall has built-in high availability and cloud scalability, while Virtual WAN provides a simple, unified, global connectivity and security platform to deploy large-scale branch connectivity.

3. Improved disk storage
Microsoft has expanded the portfolio of Azure Disk offerings to deploy any app in Azure, including those that are the most IO intensive. The new previews include Ultra SSDs, Standard SSDs, and larger managed disk sizes to help deal with data-intensive workloads, with better availability, reliability, and latency compared to standard SSDs.

4. Hybrid
Microsoft has announced new hybrid capabilities to manage data, create even more consistency, and secure hybrid environments, introducing Azure Data Box Edge, Windows Server 2019, and Azure Stack. With AI-enabled edge computing capabilities and an OS that supports hybrid management and flexible application deployment, Azure is making waves in the developer community.

Built-in security and management

For improved security, Azure has announced new services in preview, such as the Confidential Computing DC VM series, Secure Score, improved threat protection, and network map. These expand Azure's security controls and services to protect networks, applications, data, and identities, and are enhanced by the unique intelligence that comes from the trillions of signals Microsoft collects in running first-party services like Office 365 and Xbox.

For better management, Azure has announced the preview of Azure Blueprints, which make it easy to deploy and update Azure environments in a repeatable manner using composable artifacts such as policies, role-based access controls, and resource templates. Azure Cost Management in the Azure portal (preview) will provide access to cost management from Power BI or directly from your own custom applications.

Migration

To make migration to the cloud less challenging, Azure has announced support for Hyper-V assessments in Azure Migrate, and Azure SQL Database Managed Instance, which enables users to migrate SQL Servers to a fully managed Azure service. Microsoft has also announced that if you migrate Windows Server or SQL Server 2008/R2 to Azure, you will get three years of free extended security updates on those systems, which could save you some money when Windows Server and SQL Server 2008/R2 reach end of support (EOS).

Automated ML capability in Azure Machine Learning

The problem of finding the best machine learning pipeline for a given dataset scales faster than the time available for data science projects. Azure's automated machine learning enables developers to access an automated service that identifies the best machine learning pipelines for their labelled data. Data scientists get a powerful productivity tool that also takes uncertainty into account, incorporating a probabilistic model to determine the best pipeline to try next.

To follow more of the Azure buzz, head to Microsoft's official blog.

Microsoft's Immutable storage for Azure Storage Blobs, now generally available
Azure Functions 2.0 launches with better workload support for serverless
Microsoft announces Azure DevOps, makes Azure pipelines available on GitHub Marketplace

Azure Functions 2.0 launches with better workload support for serverless

Melisha Dsouza
25 Sep 2018
2 min read
Microsoft has announced the general availability of Azure Functions 2.0. The new release aims to handle demanding workloads, which should make managing the scale of serverless applications easier than ever before. With an improved user experience and new developer capabilities, the release is evidence of Microsoft looking to take full advantage of interest in serverless computing.

New features in Azure Functions 2.0

Azure Functions can now run on more platforms: Azure Functions are now supported in more environments, including local Mac or Linux machines. Integration with VS Code helps developers have a best-in-class serverless development experience on any platform.

Code optimizations: Functions 2.0 adds general host improvements, support for more modern language runtimes, and the ability to run code from a package file. .NET developers can now author functions using .NET Core 2.1, which provides a significant performance gain and helps to develop and run .NET functions in more places. Assembly resolution has been improved to reduce the number of conflicts. Functions 2.0 also supports both Node 8 and Node 10, with improved performance in general.

A powerful new programming model: The bindings and integrations of Functions 1.0 have been improved in Functions 2.0. All bindings are now brought in as extensions, and the change to decoupled extension packages allows bindings (and their dependencies) to be versioned without depending on the core runtime. The recent launch of Azure SignalR Service, a fully managed service, enables developers to focus on building real-time web experiences without worrying about setting up, hosting, scaling, or load balancing the SignalR server. An extension for this service can be found in its GitHub repo; check out the SignalR Service binding reference to start building real-time serverless applications.

Easier development: To improve productivity, Microsoft has introduced powerful native tooling inside Visual Studio, VS Code, and VS for Mac, as well as a CLI that can be run alongside any code editing experience. Functions 2.0 also gives more visibility into distributed tracing: dependencies are automatically tracked, and cross-resource connections are automatically correlated across a variety of services.

To know more about the updates in Azure Functions 2.0, head to Microsoft's official blog.

Microsoft's Immutable storage for Azure Storage Blobs, now generally available
Why did last week's Azure cloud outage happen? Here's Microsoft's Root Cause Analysis Summary.
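As a rough sketch of what a Functions 2.0 function looks like on the Node runtime, the snippet below shows an HTTP-triggered function. The use of the @azure/functions type package and the function name are assumptions, and the HTTP trigger and output bindings would be declared separately in the function's function.json.

```typescript
// Minimal HTTP-triggered function sketch for the Functions 2.0 Node runtime.
// Assumes the @azure/functions types and an HTTP binding named "res" in function.json.
import { AzureFunction, Context, HttpRequest } from "@azure/functions";

const httpTrigger: AzureFunction = async (context: Context, req: HttpRequest) => {
  // Read a name from the query string or JSON body, with a fallback.
  const name = req.query.name || (req.body && req.body.name) || "world";
  // Set the HTTP output binding configured in function.json.
  context.res = {
    status: 200,
    body: `Hello, ${name}, from Azure Functions 2.0`,
  };
};

export default httpTrigger;
```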