
Tech News

3711 Articles

Telcos Move from Black Boxes to Open Source from Linux.com

Matthew Emerick
07 Oct 2020
12 min read
Linux Foundation Networking (LFN) organized its first virtual event last week, and we sat down with Arpit Joshipura, the General Manager of Networking, IoT and Edge at the Linux Foundation, to talk about the key points of the event and how LFN is leading the adoption of open source within the telco space.

Swapnil Bhartiya: Today, we have with us Arpit Joshipura, General Manager of Networking, IoT and Edge at the Linux Foundation. Arpit, what were some of the highlights of this event? Some big announcements that you can talk about?

Arpit Joshipura: This was a global event with more than 80 sessions, attended by people from over 75 countries. The sessions were very diverse: a lot of them were end-user and operator driven, as well as from our vendors and partners. If you take LF Networking and LF Edge as the two umbrellas leading the networking and edge implementations here, we had some very significant announcements. I would group them into five main things. Number one, we released a white paper at the Linux Foundation level on how a number of vertical industries have been transformed using open source. These are over-100-year-old industries like telecom, automotive, finance, energy, healthcare, etc. So that's one big announcement: vertical industries have taken advantage of open source. The second announcement was easy enough: Google Cloud joins Linux Foundation Networking as a partner. That announcement comes on the basis of the telecom market and the cloud market converging and building on each other. The third major announcement was about a project under LF Networking. If you remember, two years ago a project collaboration with GSMA was started. It was called CNTT, and it really defined and narrowed the scope of interoperability and compliance. And we have OPNFV under LFN. What we announced at the Open Networking and Edge Summit is that the two projects are going to come together.
This would be fantastic for a global community of operators who are simplifying the deployment and interoperability of NFVI, VNFs, and CNFs. The next announcement was around a research study we released on the open source code created by Linux Foundation Networking, using LFN analytics and COCOMO estimation. We're talking $7.2 billion worth of IP investment, right? This is the power of shared technology. And finally, we released a survey of the Edge community asking, "Why are you contributing to open source?" And the answers were fascinating. They were all around innovation, speed to deployment, and market creation. Yes, cost was important, but not initially. So those were the five big highlights of the show from an LFN and LF Edge perspective.

Swapnil Bhartiya: There are two things I'm interested in. One is the consolidation you talked about, and the second is the survey. The fact is that everybody is using open source; there is no doubt about it. But since everybody's using it, there seems to be a gap in awareness of how to be a good open source citizen as well. What have you seen in the telco space?

Arpit Joshipura: First of all, five years ago they were all using black-box and proprietary technologies. Then we launched a project called OpenDaylight. OpenDaylight announced its 13th release today, roughly on its six-year anniversary, and in that time the telcos have gone from proprietary technology to driving one of the most active projects, ONAP. The telcos are four of the top 10 contributors of source code in open source, right? Who would have imagined that AT&T, Verizon, Amdocs, DT, Vodafone, China Mobile, China Telecom, you name it, would all be actively contributing code? So that's a paradigm shift: not only consuming open source, but also contributing to it.

Swapnil Bhartiya: And since you mentioned ONAP, if I'm not wrong, I think AT&T released its own work as ECOMP.
And then the projects within the Foundation were merged to create ONAP. And then you mentioned CNTT. So, what I want to understand from you is how many projects there are within the Foundation. The Linux Foundation and all those other foundations are open, so they're a very good place for projects to come in, and it's obvious that some projects will overlap. So what is the situation right now? Where do you see overlap happening, and at the same time, are there still gaps that you need to fill?

Arpit Joshipura: So that's a question of the philosophies of a foundation, right? I'll start off with the loosest situation, which is GitHub. Millions and millions of projects on GitHub. Any PhD student can throw his code on GitHub and say that's open source, and at the end of the day, if there's no community around it, that project is dead. Okay, that's the most extreme scenario. Then there are foundations like CNCF that have a process for accepting projects that could have competing solutions: may the best project win. From an LF Networking and LF Edge perspective, the process is a little more restrictive: there is a formal project lifecycle document and a process available on the wiki that looks at the complementary nature of the project, at the ecosystem, and at how it will enable and foster innovation. Based on that, the governing board, under the neutral governance we have set up at the Linux Foundation, approves it. Overall, it depends on the philosophy. For LFN and LF Edge, we have eight projects under each umbrella, and most of these projects are quite complementary, solving different use cases in different parts of the network.

Swapnil Bhartiya: Awesome. Now, I want to talk about 5G a bit. I did not hear any announcements, but can you talk a bit about the work going on to help the further deployment of 5G technologies?

Arpit Joshipura: Yeah.
I'm happy and sad to say that 5G is old news, right? The reality is that all of the infrastructure work on 5G was already released earlier this year. The ONAP Frankfurt release, for example, has a blueprint on 5G slicing, right? All the work has been done, and there are lots of blueprints in Akraino using 5G and MEC. So that work is done, the cities are getting lit up by the carriers, and you see announcements from global carriers on 5G deployments. I think there are two missing pieces of work remaining for 5G. One is obviously the O-RAN support. The O-RAN Software Community, which we also host at the Linux Foundation, is coming up with a second release, and all the support for 5G is in there. The second part of 5G is really the compliance and verification testing. A lot of work is going into CNTT and OPNFV. Remember that merged project we talked about, where 5G is in the context of not just OpenStack but also Kubernetes? So the cloud-native aspects of 5G are all being worked on this year. I think we'll see a lot more cloud-native 5G deployments next year, primarily because cloud-native projects like ONAP integrate with platforms like Anthos or Azure Stack.

Swapnil Bhartiya: What are some of the biggest challenges that the telco industry is facing? I mean, technically, virtualization and all those things were already there, and the foundations have solved those problems. What rough edges are still there that you're trying to resolve for them?

Arpit Joshipura: Yeah. I think the recent pandemic caused a significant change in the telcos' thinking, right? Fortunately, because they had already started on a virtualization and open source route — you heard it from the operators, from Deutsche Telekom and others — all of them were able to handle the change in network traffic, the change in traffic direction, the SLA workloads, etc., right? All because of the softwarization, as we call it, of the network.
Given the pandemic, I think the first challenge for them was: can the network hold up? And the answer is yes, right? All the work-from-home, all the video calls and hangouts over the web — the network held up; that was number one. Number two: it's good that the network held up, but did I end up spending millions and millions of dollars in operational expenditure? And the answer to that is no, especially for the telcos who have embraced an open source ecosystem, right? People who have deployed projects like SDN controllers or ONAP for automation, orchestration, and closed-loop control automatically configure and reconfigure based on workloads, services, and traffic, and that does not require manual labor. Tremendous amounts of cost were saved from an opex perspective. Operators who are still in the old mindset, by contrast, have significantly increased their opex, and that has put a real strain on their balance sheets. So those were the two big things that we felt were challenges, but they have been solved. Going forward, it's now a quick rollout and build-out of 5G, expanding 5G to the Edge, and then partnering with the public cloud providers, at least here in the US, to bring cloud-native solutions to market.

Swapnil Bhartiya: Awesome. Now, Arpit, if I'm not wrong, LF Edge is going to celebrate its second anniversary in January. What do you feel the project has achieved so far? What are its accomplishments? And what are some challenges that the project still has to tackle?

Arpit Joshipura: Let me start with the most important accomplishment as a community, and that is terminology. We have a project called State of the Edge, and we just issued a white paper that outlines terminology, terms, and definitions of what Edge is, because historically people have used terms like thin edge, thick edge, cloud edge, far edge, near edge, and so on. They're all relative terms: it's an edge in relation to who I am.
Instead of that, the paper now defines absolute terms. To give you a quick example, there are really two kinds of edges: a device edge and a service provider edge. A device edge is really controlled by the operator — by the end user, I should say. A service provider edge is really shared as a service, and the last mile typically separates them. Now, if you double-click on each of these categories, you have several incarnations of an edge. You can have an extremely constrained edge — microcontrollers, etc., mostly manufacturing and IIoT types. You could have a smart device edge, like gateways, etc. Or you could have an on-prem, server-type device edge. Either way, an end user controls that edge. The other edge — whether it's at the radio base stations or in a smart central office — is controlled by the operator. So that's the first accomplishment: standardizing terminology. The second big Edge accomplishment is around two projects: Akraino and EdgeX Foundry. These are stage 3 mature projects, and they have come out with significant [results]. Akraino, for example, has come out with 20-plus blueprints. These are blueprints that can actually be deployed today, right? Just to refresh: a blueprint is a declarative configuration that has everything end to end to solve a particular use case — things like connected classrooms, AR/VR, connected cars, network cloud, smart factories, smart cities, etc. All of these are available today. EdgeX is the IoT framework for an industrial setup, and it's the most downloaded. Those two projects, along with Fledge, EVE, Baetyl, Home Edge, Open Horizon, Secure Device Onboard, and others, have shown very, very strong growth — over 200% growth in contributions. Huge growth in membership, huge growth in new projects, and in the community overall. We're seeing that Edge is really picking up. Remember, I told you Edge is four times the size of the cloud. So everybody is in it.
Swapnil Bhartiya: Now, the second part of the question was about the challenges that are still there. You talked about accomplishments; what problems do you see that the project still has to solve for the industry and the community?

Arpit Joshipura: The fundamental challenge that remains is that we're still working as a community across different markets. I think the vendor ecosystem is trying to figure out who is the customer and who is the provider, right? Think of it this way: a carrier — for example, AT&T — could be a provider to a manufacturing factory, which in turn could consume something from a provider and then ship it to an end user. So there's a value shift, if you will, in the business world over who gets the cut. That's still a challenge people are trying to figure out. I think people who are quick to define, solve, and implement solutions using open technology will probably turn out to be winners, and people who just do analysis paralysis will be left behind, like in any other industry. That's fundamentally number one. And number two is the speed at which we want to solve things. The pandemic has just accelerated the need for Edge and 5G. People are eager to get gaming with low latency, predictive maintenance in manufacturing with low latency, home surveillance with low latency, connected cars, autonomous driving, all the classroom use cases. They would have been done next year, but because of the pandemic, it all got accelerated.

The post Telcos Move from Black Boxes to Open Source appeared first on Linux.com.


Announcing improved MDX query performance in Power BI from Microsoft Power BI Blog | Microsoft Power BI

Matthew Emerick
07 Oct 2020
1 min read
With a recent update of the Analysis Services Tabular engine in Power BI, Multidimensional Expressions (MDX) clients, such as Microsoft Excel, can now enjoy improved query performance against datasets in Power BI.


Three ways serverless APIs can accelerate enterprise innovation from Microsoft Azure Blog > Announcements

Matthew Emerick
07 Oct 2020
5 min read
With the wrong architecture, APIs can be a bottleneck not only to your applications but to your entire business. Bottlenecks such as downtime, low performance, or high application complexity can result in inflated infrastructure and organizational costs and lost revenue. Serverless APIs mitigate these bottlenecks with autoscaling capabilities and consumption-based pricing models. Once you start thinking of serverless not only as a remover of bottlenecks but also as an enabler of business, layers of your application infrastructure become a source of new opportunities. This is especially true of the API layer: APIs can be productized to scale your business, attract new customers, or offer new services to existing customers, in addition to their traditional role as the communicator between software services. Given the increasing dominance of APIs and API-first architectures, companies and developers are gravitating toward serverless platforms to host APIs and API-first applications to realize these benefits. One serverless compute option for hosting APIs is Azure Functions: event-triggered code that scales on demand, where you pay only for what you use. Gartner predicts that 50 percent of global enterprises will have deployed a serverless functions platform by 2025, up from only 20 percent today. You can publish Azure Functions through API Management to secure, transform, maintain, and monitor your serverless APIs.

Faster time to market

Modernizing your application stack to run microservices on a serverless platform decreases internal complexity and reduces the time it takes to develop new features or products. Each serverless function implements a microservice. By adding many functions to a single API Management product, you can build those microservices into an integrated distributed application. Once the application is built, you can use API Management policies to implement caching or enforce security requirements.
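To make the consumption-based pricing model mentioned above concrete, here is a toy bill calculation in Python. The billing dimensions (a per-execution charge plus a per-GB-second compute charge) mirror how consumption plans are commonly metered, but the rates and the function itself are illustrative placeholders, not Azure's published prices:

```python
def monthly_bill(executions, avg_duration_s, mem_gb,
                 price_per_million=0.20, price_per_gb_s=0.000016):
    """Consumption-style bill: pay per execution plus per GB-second used.
    The default rates are illustrative placeholders only."""
    exec_cost = executions / 1_000_000 * price_per_million
    compute_cost = executions * avg_duration_s * mem_gb * price_per_gb_s
    return exec_cost + compute_cost

# 10M requests/month, 200 ms average duration, 0.5 GB memory:
print(round(monthly_bill(10_000_000, 0.2, 0.5), 2))
```

The point of the model is that an idle API costs nothing: with zero executions the bill is zero, which is exactly the property that removes the "exaggerated infrastructure cost" bottleneck described above.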
Quest Software uses Azure App Service to host microservices in Azure Functions. These support user capabilities such as registering new tenants, and application functionality like communicating with other microservices or other Azure platform resources such as the Azure Cosmos DB managed NoSQL database service. "We're taking advantage of technology built by Microsoft and released within Azure in order to go to market faster than we could on our own. On average, over the last three years of consuming Azure services, we've been able to get new capabilities to market 66 percent faster than we could in the past." - Michael Tweddle, President and General Manager of Platform Management, Quest

Quest also uses Azure API Management as a serverless API gateway for the Quest On Demand microservices that implement business logic with Azure Functions, and to apply policies that control access, traffic, and security across microservices.

Modernize your infrastructure

Developers should be focusing on developing applications, not provisioning and managing infrastructure. API Management provides a serverless API gateway that delivers a centralized, fully managed entry point for serverless backend services. It enables developers to publish, manage, secure, and analyze APIs at global scale. Using serverless functions and API gateways together allows organizations to better optimize resources and stay focused on innovation. For example, a serverless function can provide an API through which restaurants adjust their local menus if they run out of an item. Chipotle turned to Azure to create a unified web experience from scratch, leveraging both Azure API Management and Azure Functions for critical parts of its infrastructure. Calls to back-end services (such as ordering, delivery, and account management and preferences) hit Azure API Management, which gives Chipotle a single, easily managed endpoint and API gateway into its various back-end services and systems.
With such functionality, other development teams at Chipotle are able to work on modernizing the back-end services behind the gateway in a way that remains transparent to Smith's front-end app. "API Management is great for ensuring consistency with our API interactions, enabling us to always know what exists where, behind a single URL," says Smith. "There are lots of changes going on behind the API gateway, but we don't need to worry about them." - Mike Smith, Lead Software Developer, Chipotle

Innovate with APIs

Serverless APIs can be used to increase revenue, decrease cost, or improve business agility. As a result, technology becomes a key driver of business growth. Businesses can leverage artificial intelligence to analyze API calls, recognize patterns, and predict future purchase behavior, thus optimizing the entire sales cycle. PwC AI turned to Azure Functions to create a scalable API for its regulatory obligation knowledge mining solution. It also uses Azure Cognitive Search to quickly surface predictions found by the solution, embedding years of experience into an AI model that easily identifies regulatory obligations within text. "As we're about to launch our ROI POC, I can see that Azure Functions is a value-add that saves us two to four weeks of work. It takes care of handling prediction requests for me. I also use it to extend the model to other PwC teams and clients. That's how we can productionize our work with relative ease." - Todd Morrill, Machine Learning Scientist-Manager, PwC

Quest Software, Chipotle, and PwC are just a few Microsoft Azure customers leveraging tools such as Azure Functions and Azure API Management to create API architectures that ensure their APIs are monitored, managed, and secure. Rethinking your API approach to use serverless technologies will unlock new capabilities within your organization that are not limited by scale, cost, or operational resources.
Get started immediately

Learn about common serverless API architecture patterns at the Azure Architecture Center, where we provide high-level overviews and reference architectures for common patterns that leverage Azure Functions and Azure API Management, in addition to other Azure services — including a reference architecture for a web application with a serverless API.


How Oracle v. Google could upend software development from InfoWorld Java

Matthew Emerick
07 Oct 2020
1 min read
Oracle v. Google has been winding its way through the courts for a decade. You've probably already heard that the high-profile legal case could transform software engineering as we know it — but since nothing ever seems to happen, it's forgivable if you've made a habit of tuning out the news. It might be time to tune back in. The latest iteration of the case will be heard by the U.S. Supreme Court in its 2020-2021 term, which began this week (after being pushed back due to coronavirus concerns). The decision of the highest court in the land can't be overturned and is unlikely to be reversed, so unlike previous decisions at the district and circuit court levels, it will stick for good. And while the case is being heard in the U.S., the decision will impact the entire global tech industry.


Optimize your Azure workloads with Azure Advisor Score from Microsoft Azure Blog > Announcements

Matthew Emerick
07 Oct 2020
3 min read
Modern engineering practices, like Agile and DevOps, are redirecting the ownership of security, operations, and cost management from centralized teams to workload owners, catalyzing innovation at a higher velocity than in traditional data centers. In this new world, workload owners are expected to build, deploy, and manage cloud workloads that are secure, reliable, performant, and cost-effective. If you're a workload owner, you want well-architected deployments, so you might be wondering: how well are you doing today? Of all the actions you can take, which ones will make the biggest difference for your Azure workloads? And how will you know if you're making progress? That's why we created Azure Advisor Score: to help you understand how well your Azure workloads are following best practices, assess how much you stand to gain by remediating issues, and prioritize the most impactful recommendations you can take to optimize your deployments.

Introducing Advisor Score

Advisor Score enables you to get the most out of your Azure investment using a centralized dashboard to monitor and work toward optimizing the cost, security, reliability, operational excellence, and performance of your Azure resources. Advisor Score will help you:
  • Assess how well you're following the best practices defined by Azure Advisor and the Microsoft Azure Well-Architected Framework.
  • Optimize your deployments by taking the most impactful actions first.
  • Report on your well-architected progress over time.
Baselining is one great use case we've already seen with customers. You can use Advisor Score to baseline yourself and track your progress over time toward your goals by reviewing your score's daily, weekly, or monthly trends. Then, to reach your goals, you can act first on the individual recommendations and resources with the most impact.
How Advisor Score works

Advisor Score measures how well you're adopting Azure best practices, comparing and quantifying the impact of the Advisor recommendations you're already following and the ones you haven't implemented yet. Think of it as a gap analysis for your deployed Azure workloads. The overall score is calculated on a scale from 0 percent to 100 percent, both in aggregate and separately for cost, security (coming soon), reliability, operational excellence, and performance. A score of 100 percent means all your resources follow all the best practices recommended in Advisor. On the other end of the spectrum, a score of 0 percent means that none of your resources follow the recommended best practices. Advisor Score weighs all resources, both those with and without active recommendations, by their individual cost relative to your total spend. This builds on the assumption that the resources which consume a greater share of your total investment in Azure are more critical to your workloads. Advisor Score also adds weight to resources with longstanding recommendations: the idea is that the accumulated impact of these recommendations grows the longer they go unaddressed.

Review your Advisor Score today

Check your Advisor Score today by visiting Azure Advisor in the Azure portal. To learn more about the model behind Advisor Score and see examples of how the score is calculated, review our Advisor Score documentation and this behind-the-scenes blog from our data science team about the development of Advisor Score.
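The cost-weighted aggregation described above can be sketched in a few lines of Python. This is an illustrative approximation only: the resource fields and the formula are assumptions for the sake of the example, not Microsoft's published scoring algorithm (which, among other things, also adds weight for longstanding recommendations):

```python
def advisor_score(resources):
    """Toy cost-weighted score, 0-100. Each resource is a dict with a
    monthly 'cost' and a 'healthy' flag (True when the resource follows
    all recommended best practices). NOTE: illustrative model only."""
    total_cost = sum(r["cost"] for r in resources)
    if total_cost == 0:
        return 100.0  # nothing deployed, nothing to remediate
    # Weight each compliant resource by its share of total spend.
    healthy_cost = sum(r["cost"] for r in resources if r["healthy"])
    return 100.0 * healthy_cost / total_cost

resources = [
    {"cost": 700.0, "healthy": True},   # expensive VM, well configured
    {"cost": 200.0, "healthy": False},  # storage account with open issues
    {"cost": 100.0, "healthy": True},   # small database, well configured
]
print(advisor_score(resources))  # 80.0
```

Note how the one unhealthy resource costs only 20 percent of the total spend, so the score is 80 rather than the unweighted two-out-of-three (about 67): cheaper resources drag the score down less, which matches the "greater share of spend is more critical" assumption in the text.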


Lower prices and more flexible purchase options for Azure Red Hat OpenShift from Microsoft Azure Blog > Announcements

Matthew Emerick
07 Oct 2020
4 min read
For the past several years, Microsoft and Red Hat have worked together to co-develop hybrid cloud solutions intended to enable greater customer innovation. In 2019, we launched Azure Red Hat OpenShift as a fully managed, jointly engineered implementation of Red Hat OpenShift, running OpenShift 3.11, that is deeply integrated into the Azure control plane. With the release of Red Hat OpenShift 4, we announced the general availability of Azure Red Hat OpenShift on OpenShift 4 in April 2020. Today we're sharing that, in collaboration with Red Hat, we are dropping the price of Red Hat OpenShift licenses on Azure Red Hat OpenShift worker nodes by up to 77 percent. We're also adding the choice of a three-year term for Reserved Instances (RIs) on top of the existing one-year RI and pay-as-you-go options, with a reduction in the minimum number of virtual machines required. The new pricing is effective immediately. Finally, as part of the ongoing improvements, we are increasing the Service Level Agreement (SLA) to 99.95 percent. With these new price reductions, Azure Red Hat OpenShift provides even more value with a fully managed, highly available enterprise Kubernetes offering that manages the upgrades, patches, and integration for the components required to make a platform. This allows your teams to focus on building business value, not operating technology platforms.

How can Red Hat OpenShift help you?

As a developer

Kubernetes was built for the needs of IT operations, not developers. Red Hat OpenShift is designed so developers can deploy apps on Kubernetes without needing to learn Kubernetes. With built-in Continuous Integration (CI) and Continuous Delivery (CD) pipelines, you can code and push to a repository and have your application up and running in minutes.
Azure Red Hat OpenShift includes everything you need to manage your development lifecycle: standardized workflows, support for multiple environments, continuous integration, release management, and more. Also included is the ability to provision self-service, on-demand application stacks and deploy solutions from the Developer Catalog, such as OpenShift Service Mesh, OpenShift Serverless, Knative, and more. Red Hat OpenShift provides commercial support for the languages, databases, and tooling you already use, while providing easy access to Azure services such as Azure Database for PostgreSQL and Azure Cosmos DB, to enable you to create resilient and scalable cloud-native applications.

As an IT operator

Adopting a container platform lets you keep up with application scale and complexity requirements. Azure Red Hat OpenShift is designed to make deploying and managing the container platform easier, with automated maintenance operations and upgrades built right in, integrated platform monitoring (including Azure Monitor for Containers), and a support experience delivered directly from the Azure support portal. With Azure Red Hat OpenShift, your developers can be up and running in minutes. You can scale on your terms, from ten containers to thousands, and only pay for what you need. With one-click updates for the platform, services, and applications, Azure Red Hat OpenShift monitors security throughout the software supply chain to make applications more stable without reducing developer productivity. You can also leverage built-in vulnerability assessment and management tools in Azure Security Center to scan images that are pushed to, imported into, or pulled from an Azure Container Registry. Discover Operators from the Kubernetes community and Red Hat partners, curated by Red Hat.
You can install Operators on your clusters to provide optional add-ons and shared services to your developers, such as AI and machine learning, application runtimes, data and document stores, monitoring, logging and insights, security, and messaging services.

Regional availability

Azure Red Hat OpenShift is available in 27 regions worldwide, and we're continuing to expand that list. Over the past few months, we have added support for Azure Red Hat OpenShift in a number of regions, including West US, Central US, North Central US, Canada Central, Canada East, Brazil South, UK West, Norway East, France Central, Germany West Central, Central India, UAE North, Korea Central, East Asia, and Japan East.

Industry compliance certifications

To help you meet your compliance obligations across regulated industries and markets worldwide, Azure Red Hat OpenShift is PCI DSS, FedRAMP High, SOC 1/2/3, ISO 27001, and HITRUST certified. Azure maintains the largest compliance portfolio in the industry, both in terms of the total number of offerings and the number of customer-facing services in assessment scope. For more details, check the Microsoft Azure Compliance Offerings.

Next steps

Try Azure Red Hat OpenShift now. We are excited about these new lower prices and how they help our customers build their business on a platform that enables IT operations and developers to collaborate effectively and to develop and deploy containerized applications rapidly with strong security capabilities.

How to write to a file in Python — Txt, Docx, CSV, and more! from Android Development – Android Authority

Matthew Emerick
07 Oct 2020
4 min read
Writing to files is one of the most important things you will learn in any new programming language. It allows you to save user data for future reference, to manipulate large data sets, or to build useful tools like word processors and spreadsheets. Let’s find out how to write to a file in Python!

How to write to a file in Python – .txt files

The simplest way to write to a file in Python is to create a new text file. This will allow you to store any string to retrieve later. To do this, you first open the file, then add the content you want, and then close the file to finish.

my_file = open("NewFile.txt", "w+")
my_file.write("Hello World!")
my_file.close()

In this example, we have opened a new file, written the words “Hello World!”, and then closed the file. The “w+” tells Python that we are writing to a new file. If the file already exists, that file is overwritten. If the file doesn’t already exist, it will be created. But what if you want to append (add) to a file that already exists? In this case, you simply swap the “w+” for an “a+”. You can learn more useful tricks in a previous article: How to create a file in Python and more! That article will show you how to delete and move files too! To display the contents of the file, just use the following two lines:

my_file = open("NewFile.txt", "r")
file_contents = my_file.read()

How to write to other types of file

But what if you have another type of file you want to work with, other than a text file? What if you want to create a new spreadsheet file? Or a new Word document? In many cases, you simply need to learn the formatting used by a particular file type and then emulate it. For example, CSV files are used to store spreadsheets. The name “CSV” actually refers to the way this formatting works: “Comma-Separated Values.” In short, each line represents a row in a database and contains a series of values separated by commas. Each comma represents the start of a new column or cell!
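To make that format concrete, here is a minimal sketch that writes a tiny CSV by hand and then appends a row with the standard-library csv module (the filename and data are just examples):

```python
import csv

# Write a small spreadsheet by hand: one row per line,
# cells separated by commas.
with open("People.csv", "w", newline="") as f:
    f.write("Name,Age\n")
    f.write("Ada,36\n")

# The standard-library csv module handles the separators
# and quoting for you.
with open("People.csv", "a", newline="") as f:
    csv.writer(f).writerow(["Grace", 45])

# Read it back: each line becomes a list of cell values.
with open("People.csv", newline="") as f:
    rows = list(csv.reader(f))

print(rows)  # → [['Name', 'Age'], ['Ada', '36'], ['Grace', '45']]
```

Save the result with a “.csv” extension and it will open as a spreadsheet in Excel.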
You can, therefore, save a bunch of data using the exact same method you used to create your text file, but be sure to insert commas and new lines in the right places. If you then save the file with a “.csv” extension, it will open in Excel when you click on it! The same goes for many other types of file. For example, you could create an HTML file this way by using angle-bracket tags to define headers, bold text, and other basic formatting! Many developers will create their own formats for storing data specific to their creations. Now you know how to write to a file in Python regardless of the type of file! Learn more about CSV files in Python here: How to open CSV files in Python: store and retrieve large data sets

How to write to a file in Python with modules

Of course, some files are going to contain more complex formatting than others. For example, if you want to write a .docx file in Python, you’ll come unstuck! Open a Word document in a text editor and you’ll see that Microsoft uses a lot of confusing formatting and annotation to define the layout and add additional information. This is where modules come in! First, install the module you want via pip. You can do this by using the following command:

pip install python-docx

If you are running from a command line in Windows, try:

python -m pip install python-docx

Now in your Python code you can do the following:

import docx

my_doc = docx.Document()
my_doc.add_paragraph("Hello World!")
my_doc.save("D:/NewHelloDoc.docx")

This will write “Hello World!” to a document and then save it! You can also do some other, more complex formatting:

my_doc.add_heading("Header 1", 0)
my_doc.add_heading("Header 2", 1)
my_doc.add_heading("Header 3", 2)
my_doc.add_picture("D:/MyPicture.jpg", width=docx.shared.Inches(5), height=docx.shared.Inches(7))

Regardless of the type of file you want to work with, you’ll almost always find a module that can handle it for you. These are usually free to use and come with documentation you can work through!
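The same “find a module” advice applies even to formats Python supports out of the box. For instance, the standard-library json module reads and writes JSON files without any extra installation (the filename here is just an example):

```python
import json

# Serialize a Python dictionary to a JSON file...
data = {"name": "Ada", "scores": [1, 2, 3]}
with open("data.json", "w") as f:
    json.dump(data, f)

# ...and parse it back into an equivalent Python object.
with open("data.json") as f:
    loaded = json.load(f)

print(loaded == data)  # → True
```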
That’s just another of the amazing things about coding in Python! And that is how to write to a file in Python! If you’re enjoying learning Python, then why not take your education to the next level? We’ve compiled a list of the best online Python courses where you can find some amazing discounts. Check it out!

Game Development with .NET from .NET Blog

Matthew Emerick
06 Oct 2020
5 min read
We’ve launched a new Game Development with .NET section on our site. It’s designed for current .NET developers to explore all the choices available to them when developing games. It’s also designed for new developers trying to learn how to use .NET by making games. We’ve also launched a new game development Learn portal for .NET filled with tutorials, videos, and documentation provided by Microsoft and others in the .NET game development community. Finally, we launched a step-by-step Unity get-started tutorial that will get you started with Unity and writing C# scripts for it in no time. We are excited to show you what .NET has to offer to you when making games. .NET is also part of Microsoft Game Stack, a comprehensive suite of tools and services just for game development.

.NET for game developers

.NET is cross-platform. With .NET you can target more than 25 different platforms with a single code base. You can make games for, but not limited to, Windows, macOS, Linux, Android, iOS, Xbox, PlayStation, Nintendo, and mixed reality devices. C# is the most popular programming language in game development. The wider .NET community is also big. There is no lack of expertise and support you can find from individuals and user groups, locally or online. .NET does not just cover building your game. You can also use it to build your game’s website with ASP.NET, your mobile app using Xamarin, and even do remote rendering with Microsoft Azure. Your skills will transfer across the entire game development pipeline.

Available game engines

The first step to developing games in .NET is to choose a game engine. You can think of engines as the frameworks and tools you use for developing your game. There are many game engines that use .NET, and they differ widely. Some of the engines are commercial and some are completely royalty-free and open source. I am excited to see some of them planning to adopt .NET 5 soon. Just choose the engine that works best for you and your game.
Would you like to read a blog post to help you learn about .NET game engines, and which one would be best for you?

Online services for your game

If you’re building your game with .NET, then you have many choices on how to build your online game services. You can use ready-to-use services like Microsoft Azure PlayFab. You can also build from scratch on Microsoft Azure. .NET also runs on multiple operating systems, clouds, and services, so it doesn’t limit you to using Microsoft’s platforms.

.NET has a rich set of tools

All the .NET tools you are used to also work when making games. Visual Studio is a great IDE that works with all .NET game engines on Windows and macOS. It provides world-class debugging, AI-assisted code completion, code refactoring, and cleanup. In addition, it provides real-time collaboration and productivity tools for remote work. GitHub also provides all your DevOps needs. Host and review code, manage projects, and build software alongside 50 million developers with GitHub.

The ecosystem

The .NET game development ecosystem is rich. Some of the .NET game engines depend on foundational work done by the open-source community to create managed graphics APIs like SharpDX, SharpVulkan, Vulkan.NET, and Veldrid. Xamarin also enables using platform-native features on iOS and Android. Beyond the .NET community, each game engine also has its own community and user groups you can join and interact with. .NET is an open-source platform with over 60,000 contributors. It’s free and a solid, stable base for all your current and future game development needs.

Learn more and start developing

Head to our new Game Development with .NET site to get an overview of what .NET provides for you when making games. If you’ve never used Unity, get started with our step-by-step Unity get-started tutorial and script with C# as quickly as possible.
If you’re looking for tutorials, videos, and documentation to get you started, head to our new game development Learn portal for .NET for more resources.

Show us the work you do for .NET game development

We’d love to see the work you do as a .NET game developer. Please reach out to us if you’d like us to talk about the games you’re making, the APIs you’re developing, the plug-ins you’re distributing, or any .NET project remotely related to game development. Did you write a great blog post, or just read one? Do you want everyone to know about an amazing new .NET game development contribution or a useful plug-in or tool? Do you have an analysis of the game development pipeline using .NET? We’d love to hear from you, and feature your contributions on future posts:

Leave us a message in the comments section below
Send Abdullah (@indiesaudi) tips on Twitter about .NET game development.

The post Game Development with .NET appeared first on .NET Blog.

New Training Course from Continuous Delivery Foundation Helps Gain Expertise with Jenkins CI/CD from Linux.com

Matthew Emerick
06 Oct 2020
1 min read
The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced the availability of a new training course, LFS267 – Jenkins Essentials. LFS267, developed in conjunction with the Continuous Delivery Foundation, is designed for DevOps engineers, Quality Assurance personnel, SREs as well as software developers and architects who want to gain expertise with Jenkins for their continuous integration (CI) and continuous delivery (CD) activities. Source: Linux Foundation Training The post New Training Course from Continuous Delivery Foundation Helps Gain Expertise with Jenkins CI/CD appeared first on Linux.com.

Quantum networks: The next generation of secure computing from Linux.com

Matthew Emerick
06 Oct 2020
1 min read
Click to Read More at Enable Sysadmin The post Quantum networks: The next generation of secure computing appeared first on Linux.com.
Microsoft partners expand the range of mission-critical applications you can run on Azure from Microsoft Azure Blog > Announcements

Matthew Emerick
06 Oct 2020
14 min read
How the depth and breadth of the Microsoft Azure partner ecosystem enables thousands of organizations to bring their mission-critical applications to Azure. In the past few years, IT organizations have been realizing compelling benefits when they transitioned their business-critical applications to the cloud, enabling them to address the top challenges they face with running the same applications on-premises. As even more companies embark on their digital transformation journey, the range of mission and business-critical applications has continued to expand, even more so because technology drives innovation and growth. This has further accelerated in the past months, spurred in part by our rapidly changing global economy. As a result, the definition of mission-critical applications is evolving and goes well beyond systems of record for many businesses. It’s part of why we never stopped investing across the platform to enable you to increase the availability, security, scalability, and performance of your core applications running on Azure. The expansion of mission-critical apps will only accelerate as AI, IoT, analytics, and new capabilities become more pervasive. We’re seeing the broadening scope of mission-critical scenarios both within Microsoft and in many of our customers’ industry sectors. For example, Eric Boyd, in his blog, outlined how companies in healthcare, insurance, sustainable farming, and other fields have chosen Microsoft Azure AI to transform their businesses. Applications like Microsoft Teams have now become mission-critical, especially this year, as many organizations had to enable remote workforces. This is also reflected by the sheer number of meetings happening in Teams. 
Going beyond Azure services and capabilities Many organizations we work with are eager to realize myriad benefits for their own business-critical applications, but first need to address questions around their cloud journey, such as: Are the core applications I use on-premises certified and supported on Azure? As I move to Azure, can I retain the same level of application customization that I have built over the years on-premises? Will my users experience any impact in the performance of my applications? In essence, they want to make sure that they can continue to capitalize on the strategic collaboration they’ve forged with their partners and ISVs as they transition their core business processes to the cloud. They want to continue to use the very same applications that they spent years customizing and optimizing on-premises. Microsoft understands that running your business on Azure goes beyond the services and capabilities that any platform can provide. You need a comprehensive ecosystem. Azure has always been partner-oriented, and we continue to strengthen our collaboration with a large number of ISVs and technology partners, so you can run the applications that are critical to the success of your business operations on Azure. A deeper look at the growing spectrum of mission-critical applications Today, you can run thousands of third-party ISV applications on Azure. Many of these ISVs in turn depend on Azure to deliver their software solutions and services. Azure has become a mission-critical platform for our partner community as well as our customers. When most people think of mission-critical applications, enterprise resource planning systems (ERP), supply chain management (SCM), product lifecycle management (PLM), and customer relationship management (CRM) applications are often the first examples that come to mind. 
However, to illustrate the depth and breadth of our mission-critical ecosystem, consider these distinct and very different categories of applications that are critical for thousands of businesses around the world: Enterprise resource planning (ERP) systems. Data management and analytics applications. Backup and business continuity solutions. High-performance computing (HPC) scenarios that exemplify the broadening of business-critical applications that rely on public cloud infrastructure. Azure’s deep ecosystem addresses the needs of customers in all of these categories and more. ERP systems When most people think of mission-critical applications, ERP, SCM, PLM, and CRM applications are often the first examples that come to mind. Some examples on Azure include: SAP—We have been empowering our enterprise customers to run their most mission-critical SAP workloads on Azure, bringing the intelligence, security, and reliability of Azure to their SAP applications and data. Viewpoint, a Trimble company—Viewpoint has been helping the construction industry transform through integrated construction management software and solutions for more than 40 years. To meet the scalability and flexibility needs of both Viewpoint and their customers, a significant portion of their clients are now running their software suite on Azure and experiencing tangible benefits. Data management and analytics Data is the lifeblood of the enterprise. Our customers are experiencing an explosion of mission-critical data sources, from the cloud to the edge, and analytics are key to unlocking the value of data in the cloud. AI is a key ingredient, and yet another compelling reason to modernize your core apps on Azure. DataStax—DataStax Enterprise, a scale-out, hybrid, cloud-native NoSQL database built on Apache Cassandra™, in conjunction with Azure, can provide a foundation for personalized, real-time scalable applications. 
Learn how this combination can enable enterprises to run mission critical workloads to increase business agility, without compromising compliance and data governance. Informatica—Informatica has been working with Microsoft to help businesses ensure that the data that is driving your customer and business decisions is trusted, authenticated, and secure. Specifically, Informatica is focused on the quality of the data that is powering your mission-critical applications and can help you derive the maximum value from your existing investments. SAS®—Microsoft and SAS are enabling customers to easily run their SAS workloads in the cloud, helping them unlock critical value from their digital transformation initiatives. As part of our collaboration, SAS is migrating its analytical products and industry solutions onto Azure as the preferred cloud provider for the SAS Cloud. Discover how mission-critical analytics is finding a home in the cloud. Backup and disaster recovery solutions Uptime and disaster recovery plans that minimize recovery time objective (RTO) and recovery point objective (RPO) are the top metrics senior IT decision-makers pay close attention to when it comes to mission-critical environments. Backing up critical data is a key element of putting in place robust business continuity plans. Azure provides built-in backup and disaster recovery features, and we also partner with industry leaders like Commvault, Rubrik, Veeam, Veritas, Zerto, and others so you can keep using your existing applications no matter where your data resides. Commvault—We continue to work with Commvault to deliver data management solutions that enable higher resiliency, visibility, and agility for business-critical workloads and data in our customers’ hybrid environments. Learn about Commvault’s latest offerings—including support for Azure VMware Solution and why their Metallic SaaS suite relies exclusively on Azure. 
Rubrik—Learn how Rubrik helps enterprises achieve low RTOs, self-service automation at scale, and accelerated cloud adoption. Veeam—Read how you can use Veeam’s solution portfolio to backup, recover, and migrate mission-critical workloads to Azure. Veritas—Find out how Veritas InfoScale has advanced integration with Azure that simplifies the deployment and management of your mission-critical applications in the cloud. Zerto—Discover how the extensive capabilities of Zerto’s platform help you protect mission critical applications on Azure. Teradici—Finally, Teradici underscores how the lines between mission-critical and business-critical are blurring. Read how business continuity plans are being adjusted to include longer term scenarios. HPC scenarios HPC applications are often the most intensive and highest-value workloads in a company, and are business-critical in many industries, including financial services, life sciences, energy, manufacturing and more. The biggest and most audacious innovations from supporting the fight against COVID-19, to 5G semiconductor design; from aerospace engineering design processes to the development of autonomous vehicles, and so much more are being driven by HPC. Ansys—Explore how Ansys Cloud on Azure has proven to be vital for business continuity during unprecedented times. Rescale—Read how Rescale can provide a turnkey platform for engineers and researchers to quickly access Azure HPC resources, easing the transition of business-critical applications to the cloud. You can rely on the expertise of our partner community Many organizations continue to accelerate the migration of their core applications to the cloud, realizing tangible and measurable value in collaboration with our broad partner community, which includes global system integrators like Accenture, Avanade, Capgemini, Wipro, and many others. 
For example, UnifyCloud recently helped a large organization in the financial sector modernize their data estate on Azure while achieving a 69 percent reduction in IT costs. We are excited about the opportunities ahead of us, fueled by the power of our collective imagination. Learn more about how you can run business-critical applications on Azure and increase business resiliency. Watch our Microsoft Ignite session for a deeper dive and demo. “The construction industry relies on Viewpoint to build and host the mission-critical technology used to run their businesses, so we have the highest possible standards when it comes to the solutions we provide. Working with Microsoft has allowed us to meet those standards in the Azure cloud by increasing scalability, flexibility and reliability – all of which enable our customers to accelerate their own digital transformations and run their businesses with greater confidence.” —Dan Farner, Senior Vice President of Product Development, Viewpoint (a Trimble Company) Read the Gaining Reliability, Scalability, and Customer Satisfaction with Viewpoint on Microsoft Azure blog. “Business critical applications require a transformational data architecture built on scale-out data and microservices to enable dramatically improved operations, developer productivity, and time-to-market. With Azure and DataStax, enterprises can now run mission critical workloads with zero downtime at global scale to achieve business agility, compliance, data sovereignty, and data governance.”—Ed Anuff, Chief Product Officer, DataStax Read the Application Modernization for Data-Driven Transformation with DataStax Enterprise on Microsoft Azure blog. “As Microsoft’s 2020 Data Analytics Partner of the Year, Informatica works hand-in-hand with Azure to solve mission critical challenges for our joint customers around the world and across every sector. 
The combination of Azure’s scale, resilience and flexibility, along with Informatica’s industry-leading Cloud-Native Data Management platform on Azure, provides customers with a platform they can trust with their most complex, sensitive and valuable business critical workloads.”—Rik Tamm-Daniels, Vice President of strategic ecosystems and technology, Informatica Read the Ensuring Business-Critical Data Is Trusted, Available, and Secure with Informatica on Microsoft Azure blog.       “SAS and Microsoft share a vision of helping organizations make better decisions as they strive to serve customers, manage risks and improve operations. Organizations are moving to the cloud at an accelerated pace. Digital transformation projects that were scheduled for the future now have a 2020 delivery date. Customers realize analytics and cloud are critical to drive their digital growth strategies. This partnership helps them quickly move to Microsoft Azure, so they can build, deploy, and manage analytic workloads in a reliable, high-performant and cost-effective manner.”—Oliver Schabenberger, Executive Vice President, Chief Operating Officer and Chief Technology Officer, SAS Read the Mission-critical analytics finds a home in the cloud blog.   “Microsoft is our Foundation partner and selecting Microsoft Azure as our platform to host and deliver Metallic was an easy decision. This decision sparks customer confidence due to Azure’s performance, scale, reliability, security and offers unique Best Practice guidance for customers and partners. Our customers rely on Microsoft and Azure-centric Commvault solutions every day to manage, migrate and protect critical applications and the data required to support their digital transformation strategies.”—Randy De Meno, Vice President/Chief Technology Officer, Microsoft Practice & Solutions Read the Commvault extends collaboration with Microsoft to enhance support for mission-critical workloads blog.     
“Enterprises depend on Rubrik and Azure to protect mission-critical applications in SAP, Oracle, SQL and VMware environments. Rubrik helps enterprises move to Azure securely, faster, and with a low TCO using Rubrik’s automated tiering to Azure Archive Storage. Security minded customers appreciate that with Rubrik and Microsoft, business critical data is immutable, preventing ransomware threats from accessing backups, so businesses can quickly search and restore their information on-premises and in Azure.”—Arvind Nithrakashyap, Chief Technology Officer and Co-Founder, Rubrik Learn how enterprises use Rubrik on Azure.     “Veeam continues to see increased adoption of Microsoft Azure for business-critical applications and data across our 375,000 plus global customers. While migration of applications and data remains the primary barrier to the public cloud, we are committed to helping eliminate these challenges through a unified Cloud Data Management platform that delivers simplicity, flexibility and reliability at its core, while providing unrivaled data portability for greater cost controls and savings. Backed by the unique Veeam Universal License – a portable license that moves with workloads to ensure they're always protected – our customers are able to take control of their data by easily migrating workloads to Azure, and then continue protecting and managing them in the cloud.”—Danny Allan, Chief Technology Officer and Senior Vice President for Product Strategy, Veeam Read the Backup, recovery, and migration of mission-critical workloads on Azure blog.     “Thousands of customers rely on Veritas to protect their data both on-premises and in Azure. 
Our partnership with Microsoft helps us drive the data protection solutions that our enterprise customers rely on to keep their business-critical applications optimized and immediately available.”—Phil Brace, Chief Revenue Officer, Veritas Read the Migrate and optimize your mission-critical applications in Microsoft Azure with Veritas InfoScale blog. “Microsoft has always leveraged the expertise of its partners to deliver the most innovative technology to customers. Because of Zerto’s long-standing collaboration with Microsoft, Zerto’s IT Resilience platform is fully integrated with Azure and provides a robust, fully orchestrated solution that reduces data loss to seconds and downtime to minutes. Utilizing Zerto’s end-to-end, converged backup, DR, and cloud mobility platform, customers have proven time and time again they can protect mission-critical applications during planned or unplanned disruptions that include ransomware, hardware failure, and numerous other scenarios using the Azure cloud – the best cloud platform for IT resilience in the hybrid cloud environment.”—Gil Levonai, CMO and SVP of Product, Zerto Read the Protecting Critical Applications in the Cloud with the Zerto Platform blog. “The longer business continues to be disrupted, the more the lines blur and business critical functions begin to shift to mission critical, making virtual desktops and workstations on Microsoft Azure an attractive option for IT managers supporting remote workforces in any function or industry. Teradici Cloud Access Software offers a flexible and secure solution that supports demanding business critical and mission critical workloads on Microsoft Azure and Azure Stack with exceptional performance and fidelity, helping businesses gain efficiency and resilience within their business continuity strategy.”—John McVay, Director of Strategic Alliances, Teradici Read the Longer IT timelines shift business critical priorities to mission critical blog.
"It is imperative for Ansys to support our customers' accelerating needs for on-demand high performance computing to drive their increasingly complex engineering requirements. Microsoft Azure, with its purpose-built HPC and robust go-to market capabilities, was a natural choice for us, and together we are enabling our joint customers to keep designing innovative products even as they work from home.”—Navin Budhiraja, Vice President and General Manager, Cloud and Platform, Ansys Read the Ansys Cloud on Microsoft Azure: A vital resource for business continuity during the pandemic blog.     “Robust and stable business critical systems are paramount for success. Rescale customers leveraging Azure HPC resources are taking advantage of the scalability, flexibility and intelligence to improve R&D, accelerate development and reduce costs not possible with a fixed infrastructure.”—Edward Hsu, Vice President of Product, Rescale Read the Business Critical Systems that Drive Innovation blog.     “Customers are transitioning business-critical workloads to Azure and realizing significant cost benefits while modernizing their applications. Our solutions help customers develop cloud strategy, modernize quickly, and optimize cloud environments while minimizing risk and downtime.”—Vivek Bhatnagar, Co-Founder and Chief Technology Officer, UnifyCloud Read the Moving mission-critical applications to the cloud: More important than ever blog.

Python 3.9.0 is now available, and you can already test 3.10.0a1! from Python Insider

Matthew Emerick
05 Oct 2020
2 min read
On behalf of the Python development community and the Python 3.9 release team, I’m pleased to announce the availability of Python 3.9.0. Python 3.9.0 is the newest feature release of the Python language, and it contains many new features and optimizations. You can find Python 3.9.0 here: https://www.python.org/downloads/release/python-390/ Most third-party distributors of Python should be making 3.9.0 packages available soon. See the “What’s New in Python 3.9” document for more information about features included in the 3.9 series. Detailed information about all changes made in 3.9.0 can be found in its change log. Maintenance releases for the 3.9 series will follow at regular bi-monthly intervals starting in late November of 2020. OK, boring! Where is Python 4? Not so fast! The next release after 3.9 will be 3.10. It will be an incremental improvement over 3.9, just as 3.9 was over 3.8, and so on. In fact, our newest Release Manager, Pablo Galindo Salgado, prepared the first alpha release of what will become 3.10.0 a year from now. You can check it out here: https://www.python.org/downloads/release/python-3100a1/ We hope you enjoy the new releases! Thanks to all of the many volunteers who help make Python Development and these releases possible! Please consider supporting our efforts by volunteering yourself or through organization contributions to the Python Software Foundation. https://www.python.org/psf/ More resources Online Documentation PEP 596, 3.9 Release Schedule PEP 619, 3.10 Release Schedule Report bugs at https://bugs.python.org. Help fund Python and its community. Your friendly release team, Ned Deily @nad Steve Dower @steve.dower Pablo Galindo Salgado @pablogsal Łukasz Langa @ambv  
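Two of the headline additions in 3.9 are the dictionary merge operators (PEP 584) and the new string methods removeprefix()/removesuffix() (PEP 616). A quick sketch (it requires a 3.9 interpreter, of course):

```python
# PEP 584: | merges two dicts; the right-hand side wins on conflicts.
defaults = {"theme": "light", "debug": False}
config = defaults | {"debug": True}

# PEP 616: strip a known prefix (or suffix) without slicing by hand.
version = "python-3.9.0".removeprefix("python-")

print(config)   # → {'theme': 'light', 'debug': True}
print(version)  # → 3.9.0
```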

What is Google Cloud certification and should I get it? from Android Development – Android Authority

Matthew Emerick
04 Oct 2020
5 min read
Google Cloud certification demonstrates an individual’s proficiency using Google Cloud technologies. It shows that a service provider or prospective employee can maintain and implement Google Cloud services and products into a company’s workflow. These skills are highly sought after and may lead to improved job prospects and salary. To learn more about Google Cloud certification, keep reading!

What is Google Cloud?

Google Cloud, or Google Cloud Platform (GCP), is a cloud platform or “Infrastructure as a Service” (IaaS). As such, GCP offers a broad range of services and products that businesses can use to improve the products and services they provide to clients and end-users. See also: What is Google Cloud? Google Cloud services are hosted on servers that are available 24/7 and that can scale to meet the demands of the customer. These services fall into a number of categories:

Cloud storage and backup
Database management
Developer tools
Internet of Things (IoT)
Machine learning
Analytics

For example, a company might use a Google Cloud server in order to handle voice recognition or to store a user’s credentials to be accessed across multiple devices. Cloud CDN is “fast, reliable web and video content delivery with global scale and reach.” Compute Engine is “computing infrastructure in predefined or custom machine sizes.” The list goes on. Any of these services can help to extend a company’s reach, but implementation can be complex and requires significant technical understanding. This is why a company may wish to hire a professional with Google Cloud certification.

Is Google Cloud certification right for you?

If you are an IT professional, you may be considering Google Cloud certification. But is it right for you? While certifications can be a great way to make a resume more attractive to employers and clients, they are less important than experience and qualifications. Remember: a company can always provide Google Cloud training after making a hire! 
Think of it as the “icing on the cake” that may help you to stand out against the competition. What’s more, Google Cloud certification will only be useful to those companies that plan on using these technologies (or that can be convinced that they should!). That also means the hiring company must choose Google Cloud over the competing cloud platforms: Amazon Web Services (AWS) and Microsoft Azure. Google Cloud has less market share than either of these options, so you may want to consider those certifications instead. See also: AWS vs Azure vs Google Cloud – Which certification is best for professionals? That being said, Google Cloud has specific use-cases where it is clearly the best choice. In particular, Google Cloud comes out on top for machine learning thanks to its powerful Tensor Flow platform. More and more companies are turning to machine learning to solve a wide range of challenges, so this may be a useful move to “futureproof” your career. GCP also has some impressively large clients to its name: Snap, Spotify, Best Buy, Gartner, and Coca-Cola. The best option is to seek out as many certifications as you can, to appeal to the broadest range of employers, while demonstrating a wide knowledge-base and aptitude for learning. What you need to know Strengthening the case for Google Cloud certification is its relatively low price. Certification only costs $125 for an “Associate” certificate and $200 for a “Professional” certificate. There is only one Associate-level certificate at the time of writing, which is the Associate Cloud Engineer. This basic certification covers the knowledge necessary for the day-to-day maintenance of existing GCP implementations. Professional certificates, meanwhile, cover the skills necessary to design and implement new solutions. 
There are a total of 7 certifications: Associate Cloud Engineer Professional Cloud Architect Professional Data Engineer Professional Cloud DevOps Engineer Professional Cloud Network Engineer Professional Cloud Security Engineer Professional Collaboration Engineer Exams are not graded, rather examinees will be awarded a simple “pass” or “fail.” Each exam is two hours long and must be taken at a Kryterion testing center. There are over 1,000 testing centers located across 120 different countries. Once complete, certification is valid for two years. Preparing for Google Cloud certification Before attempting an exam, it is highly recommended that you spend some time educating yourself with regards to the platform. A great way to do this is with online courses, which make it easy to learn from the comfort of your home. Many of these provide specific exam preparation. Android Authority has partnered with many leading course providers to offer huge discounts to our readers. You can get the GCP: Complete Google Data Engineer and Cloud Architect Guide for just $9, rather than the usual $199! Or how about the even more comprehensive Google Cloud Mastery Bundle for just $39, down from $1,400! It’s also recommended that you try the practice exams and exam guides provided by Google. Think you’re ready? Then register for your certification here. Good luck!

Setting up a webserver to use HTTPS from Linux.com

Matthew Emerick
03 Oct 2020
1 min read
Click to Read More at Enable Sysadmin. The post "Setting up a webserver to use HTTPS" appeared first on Linux.com.
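The linked article covers the full walkthrough. As a minimal sketch of the idea (not the Enable Sysadmin method), Python's standard library can wrap a basic file-serving web server in TLS. The `cert.pem` and `key.pem` paths below are hypothetical; in practice you would point them at a real certificate and private key, for example from Let's Encrypt or a self-signed pair generated with `openssl`.

```python
import http.server
import ssl

def make_https_server(cert_file, key_file, port=8443):
    """Serve the current directory over HTTPS.

    cert_file and key_file are paths to a PEM certificate and its
    private key (hypothetical placeholders here).
    """
    httpd = http.server.HTTPServer(
        ("0.0.0.0", port), http.server.SimpleHTTPRequestHandler
    )
    # PROTOCOL_TLS_SERVER negotiates the best TLS version both sides
    # support; raising minimum_version refuses outdated clients.
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    context.minimum_version = ssl.TLSVersion.TLSv1_2
    context.load_cert_chain(certfile=cert_file, keyfile=key_file)
    httpd.socket = context.wrap_socket(httpd.socket, server_side=True)
    return httpd

# Usage (with a real cert.pem / key.pem in place):
#   server = make_https_server("cert.pem", "key.pem")
#   server.serve_forever()
```

The same pattern applies to production servers such as Apache or Nginx: the server keeps listening as before, but its socket is wrapped so every request is encrypted in transit.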


All the latest Android developer news and features you need to know about from Android Development – Android Authority

Matthew Emerick
02 Oct 2020
4 min read
Credit: Edgar Cervantes / Android Authority

For Android devs, September was marked by one very big development: the release of Android 11! The new operating system is now officially here, so time is up for developers to ensure their apps and projects are ready. In other big news, we learned that Nvidia would be buying the processor architecture company Arm. Find out more about that and other news, along with a bunch of new tutorials and features for devs, below.

News and features from Android Authority

- Android 11 review: The devil is in the details – Our full coverage of the new Android launch.
- Nvidia to buy Arm for $40 billion from SoftBank – Nvidia confirmed this month that it would be acquiring Arm, the processor architecture company that makes the chips in most modern smartphones (not to mention IoT devices and much more). This deal, estimated to be worth $40 billion, represents a big shakeup in the mobile tech industry.
- How to use classes in Java – This post explains how to create and use classes in Java. This fundamental skill is a requirement for any Java-based Android development.
- Understanding variables in Java – Even more fundamental than classes, but there's more to variables in Java than you might assume!
- How to create an array in Java – An array is a variable that can store multiple values. Learn about the different types and what you can do with them.
- NullPointerException in Java – Explaining the billion-dollar mistake – Learn more about this infamous exception, so that you can avoid it in your own code.
- What is Azure certification? – While not specifically Android-dev related, Azure certification can help you land a job as a developer or IT professional, and better understand how to integrate cloud services into your apps and projects.
- What is AWS certification? – AWS is the largest cloud platform right now, with a huge 33% market share.
- AWS vs Azure vs Google Cloud – But which cloud platform should you learn? This article will explain the differences.
- How to use loops in Java – The key to making iterative changes, game loops, sorting files, and more.
- Try catch Java: Exception handling explained – This post teaches you how to catch problems and what to do with them!
- How to use if statements in Java – The key to understanding how to code – Once you understand variables and if statements, the sky is the limit!
- How to use a web API from your Android app – More ways to outsource powerful features for your apps!

The latest from Android Developers Blog

- Turning it up to 11: Android 11 for developers – Got to love the pun! Yes, the big news in September was, of course, the release of Android 11. We've discussed the features developers need to be aware of at length already, including enhanced 5G support, conversation notifications, one-time permissions, and more. This is no longer a drill! See also: How to make sure your app is ready for Android 11
- Lockscreen and authentication improvements in Android 11 – This post explains the concept of the tiered authentication model used in Android 11, the different classes of biometrics, and the new BiometricPrompt APIs and features.
- Introducing Android 11 on Android TV – Android 11 has also been officially launched for Android TV! The update brings privacy and performance updates, as well as several new features such as a "low latency mode" and extended gamepad support.
- Improve Your Game with Texture Compression Format Targeting – Developers can now use Google Play Asset Delivery to include textures in multiple texture compression formats.
- Android GPU Inspector Open Beta – With Android 11 available on Pixel, the Android GPU Inspector (AGI) is now in open beta. This will allow developers to find the causes of slowdown in their apps and games more easily.
- Prefer Storing Data with Jetpack DataStore – Jetpack DataStore is a new storage solution that replaces SharedPreferences, now available in alpha.

Other news and features from around the web

On the Unity side of things, September also gave us the launch of the Unity 2020.2 beta, along with the option to offload project builds with Unity Build Server. Gary Sims explained the Nvidia/Arm deal as only he can. We also got the Microsoft Surface Duo launch. Devs interested in creating experiences for the new hardware can find instructions here.

And that about wraps it up for September. Let us know if we missed something below, and how you're enjoying development for Android 11!