
Tech Guides

5 JavaScript machine learning libraries you need to know

Pravin Dhandre
08 Jun 2018
3 min read
Technologies like machine learning, predictive analytics, natural language processing, and artificial intelligence are among the most trending and innovative technologies of the 21st century. Whether it is enterprise software or a simple photo-editing application, machine learning increasingly underpins the features that make modern applications feel smart. Until now, the tools and frameworks capable of running machine learning were developed mainly in languages like Python, R, and Java. Recently, however, the web ecosystem has brought machine learning into its fold and is transforming web applications with it. In this article, we will look at the most useful and popular libraries for performing machine learning in your browser, without the need for extra software, compilers, installations, or GPUs.

TensorFlow.js (GitHub: 7.5k+ stars)

With the growing popularity of TensorFlow among machine learning and deep learning enthusiasts, Google recently released TensorFlow.js, the JavaScript version of TensorFlow. With this library, JavaScript developers can train and deploy their machine learning models in the browser without much hassle. The library is fast, flexible, and scalable, and a great way to get a practical taste of machine learning. With TensorFlow.js, importing existing models and retraining a pretrained model is a piece of cake. To check out examples of TensorFlow.js in action, visit its GitHub repository.

ConvNetJS (GitHub: 9k+ stars)

ConvNetJS provides a neural network implementation in JavaScript, with numerous demos available in its GitHub repository. The framework has an active following of programmers and coders. The library supports various neural network modules and popular machine learning techniques like classification and regression. Developers interested in bringing reinforcement learning to the browser, or in training complex convolutional networks, can visit the ConvNetJS official page.

Brain.js (GitHub: 8k+ stars)

Brain.js is another addition to the web development ecosystem that brings smart features to the browser with just a few lines of code. Using Brain.js, one can easily create simple neural networks and develop smart functionality in browser applications without much complexity. It is already preferred by web developers for client-side applications such as in-browser games, ad placement, and character recognition. You can check out its GitHub repository to see a complete demonstration of approximating the XOR function using Brain.js.
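The XOR demonstration mentioned above fits in a handful of lines. Here is a minimal sketch of what it might look like, assuming brain.js is installed via npm; the hidden-layer size is an illustrative choice, not a fixed requirement:

```javascript
// A toy network that learns XOR with brain.js.
const brain = require('brain.js');

const net = new brain.NeuralNetwork({ hiddenLayers: [3] }); // illustrative size

// The four XOR input/output pairs serve as the entire training set.
net.train([
  { input: [0, 0], output: [0] },
  { input: [0, 1], output: [1] },
  { input: [1, 0], output: [1] },
  { input: [1, 1], output: [0] }
]);

console.log(net.run([1, 0])); // expected: a value close to 1
```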
Synaptic (GitHub: 6k+ stars)

Synaptic is a well-liked machine learning library for training recurrent neural networks, as it has a built-in architecture-free generalized algorithm. A few of the built-in architectures include multilayer perceptrons, LSTM networks, and Hopfield networks. With Synaptic, you can develop various in-browser applications such as Paint an Image, Learn Image Filters, Self-Organizing Map, or Reading from Wikipedia.

Neurojs (GitHub: 4k+ stars)

Neurojs is another recently developed framework, aimed especially at reinforcement learning tasks in the browser. It mainly focuses on Q-learning, but can be used for any neural network based task, whether building a browser game or an autonomous driving application. Some of the exciting features this library offers are a full-stack neural network implementation, extended support for reinforcement learning tasks, and import/export of weight configurations. To see the complete list of features, visit the GitHub page.

Read next:
- How should web developers learn machine learning?
- NVIDIA open sources NVVL, library for machine learning training
- Build a foodie bot with JavaScript

Voice, natural language, and conversations: Are they the next web UI?

Sugandha Lahoti
08 Jun 2018
5 min read
Take any major conference that happened this year: Google I/O, Apple's WWDC, or Microsoft Build. A major focus for all these top-notch tech leaders was user experience: smoothing out how a user experiences their products. These days, the user experience depends heavily on how a system interacts with a human. It may be through responsive web designs or appealing architectures. It may also be through an interactive module such as a conversational UI, a chatbot, or a voice interface (essentially the same thing, albeit with slight differences in definition). Irrespective of what they are called, these UX models have one fixed goal: to improve the interaction between a human and a system such that it feels real.

In our recently conducted Packt Skill-up survey 2018, we asked developers and tech pros whether conversational user interfaces and chatbots are going to be the future of web UI. It seems the answer is yes: over 65% of respondents agreed that chat interactions and conversational user interfaces are the future of the web. After the recent preview of the power of Google Duplex, those numbers might be even higher if asked again today. Why has this paradigm of interacting with the web shifted from text, and even visual searches on mobile, to voice, natural language, and conversational UI? Why are Apple's Siri, Google's voice assistant, Microsoft's Cortana, and Amazon Echo releasing new versions every day?

Computing power & NLP, the two pillars

Any chatbot or voice interface requires two major factors to be successful. One is computational power, which lets a conversational UI process complex calculations. The other is natural language processing (NLP), which is what actually lets a chatbot conduct human-like conversations. Both areas have made tremendous progress in recent times. A large number of computational chips, namely GPUs and TPUs, as well as quantum computers, are being developed that can process complex calculations in a jiffy. NLP has also gained momentum, both in speech recognition capabilities (understanding language) and artificial intelligence (learning from experience). As technology in these areas expands, it paves the way for companies to adopt conversational UIs as their main user interface.

The last thing we need is more apps

There are already a very large number of apps (read: millions) available in app stores, and more arrive every day. We are almost at the peak of the hype cycle, and there is only a downward slope from here. Why? Well, I'm sure you'll agree that downloading, setting up, and managing an app is a hassle. Not to mention, humans have limited attention spans, so switching between multiple apps happens quite often. Conversational UIs are rapidly filling the vacuum left behind by mobile apps. They integrate the functionality of multiple apps in one: a simple messaging app can also book cabs, search and shop, or order food. Moreover, they can simplify routine tasks. AI-enabled chatbots can remind you of scheduled meetings, bring up the news for you every morning, analyze your refrigerator for food items to be replenished, and update your shopping cart, all with simple commands. Advancements in deep learning have also produced what are known as therapist bots. Users can confide in bots just as they do with human friends when they have a broken heart, have lost a job, or have been feeling down. (This does assume that the service provider respects users' privacy and adheres to strict policies related to data privacy and protection.)

The end of screen-based interfaces

Another flavor of conversational UI is the voice user interface (VUI). Typically, we interact with a device directly through a touchscreen or indirectly with a remote control. The VUI is the touchless version of this technology, where your voice is the only input you need. These interfaces can work solo, like Amazon Echo or Google Home, or be combined with text-based chatbots, like Apple's Siri or Google's voice assistant. You simply need to say a command, or type it, and the task is done. "Siri, text Robert: I'm running late for the meeting."
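Modern browsers already expose the raw ingredients for such an interface. Here is a minimal, hedged sketch of a browser voice interaction using the Web Speech API; browser support varies, and the webkit-prefixed constructor is assumed for Chrome:

```javascript
// Listen for one spoken command, then reply out loud.
const Recognition = window.SpeechRecognition || window.webkitSpeechRecognition;
const recognizer = new Recognition();

recognizer.onresult = (event) => {
  const command = event.results[0][0].transcript; // best transcript of the utterance
  console.log('Heard:', command);

  // Speak a confirmation back to the user via speech synthesis.
  const reply = new SpeechSynthesisUtterance(`You said: ${command}`);
  window.speechSynthesis.speak(reply);
};

recognizer.start(); // begins listening for a single utterance
```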
And boy, are voice user interfaces growing rapidly! Google Duplex, announced at Google I/O 2018, can even make phone calls for users, imitating natural human conversation almost perfectly. In fact, it even adds pause-fillers and phrases such as "um", "uh-huh", and "erm" to make the conversation sound as natural as possible. Voice interfaces also work amazingly well for people with disabilities, including visual impairments. Users who are unable to use screens and keyboards can use a VUI for their day-to-day tasks. Here's a touching review of Amazon Echo shared by a wheelchair-bound user about how the device changed his life.

The world is being swept by the wave of conversational UI, Google Duplex being the latest example. As AI deepens its roots across the technology ecosystem, intelligent assistant applications like Siri, Duplex, and Cortana will advance. This boom will push us closer to Zero UI: a seamless, interactive UI that eradicates the barrier between user and device.

Read next:
- Top 4 chatbot development frameworks for developers
- How to create a conversational assistant or chatbot using Python
- Building a two-way interactive chatbot with Twilio: A step-by-step guide

Why Alibaba cloud could be the dark horse in the public cloud race

Gebin George
07 Jun 2018
3 min read
The public cloud market seems to be dominated by industry giants from the West, like Amazon Web Services (AWS) and Microsoft Azure. Alibaba Cloud, an arm of Chinese tech giant Alibaba, entered the public cloud market relatively recently and seems to be catching up pretty quickly with its Software-as-a-Service (SaaS), Infrastructure-as-a-Service (IaaS), and Platform-as-a-Service (PaaS) offerings.

According to reports, in December 2017 Alibaba Cloud witnessed 56% year-over-year growth, with revenue as high as 12.8 billion USD. The Q2 2018 report is expected to be even better, with the market size growing to a sizeable amount. Alibaba Cloud already leads China's cloud market share. It provides around 100 core services, with data centers spread across 17 regions. Some of the stunning features of Alibaba Cloud include:

Elastic Computing

Alibaba Cloud's ECS services are highly scalable, quick, and powerful, running on high-end Intel CPUs that bring down latency to give staggering results. They come with an extra security layer for protecting applications from DDoS and Trojan attacks. The services involved here include ECS, container services, autoscaling, and so on.

Networking

Alibaba Cloud offers hybrid and distributed networking, ideal for enterprises that demand wide network coverage. This network covers communication between two VPCs as well as communication between VPCs and IDCs.

Security

It has built-in anti-DDoS management and security assessment services. This definitely reduces the cost of hiring and training quality security engineers to analyze and manage security services and data breaches.

Storage and CDN

Alibaba Cloud's OSS (Object Storage Service) helps you store, back up, and archive huge amounts of data in the cloud. The service is flexible: you pay only for what you use, with no additional costs involved.

Analytics

It comprises a wide range of analytics services, such as business analytics, data processing, and stream analytics. Services like Elastic MapReduce, Apache Hadoop, and Apache Spark can be run easily on Alibaba Cloud for efficient cloud analytics. For detailed products and services from Alibaba Cloud, refer to their official site.

AWS and Azure have dominated the public cloud market with an array of services that changed with market requirements. Considering the current advancements in Alibaba Cloud, and its affordable and highly competitive price range, Alibaba joins the others in the race to dominate the public cloud market.

Read next:
- Microsoft Build 2018 Day 1: Azure meets Artificial Intelligence
- How to create your own AWS CloudTrail
- Google announces the largest overhaul of their Cloud Speech-to-Text

A tale of two tools: Tableau and Power BI

Natasha Mathur
07 Jun 2018
11 min read
Business professionals are on a constant lookout for a powerful yet cost-effective BI tool to ramp up operational efficiency within their organizations. Two front-runners in the self-service Business Intelligence field today are Tableau and Power BI. The two tools, although quite similar in nature, offer different features. Most experts say the right tool depends on the size, needs, and budget of an organization, but when compared closely, one of them clearly beats the other in terms of its features. Now, instead of comparing the two based on their pros and cons, we'll let Tableau and Power BI take over from here and argue their own case, covering topics from features and usability to pricing and job opportunities. For those of you who aren't interested in a good story, there is a summary of the key points at the end of the article comparing the two tools.

The clock strikes 2 o'clock for a meeting on a regular Monday afternoon. Tableau, a market leader in Business Intelligence and data analytics, and Power BI, another standout performer and Tableau's opponent in the field of Business Intelligence, head off for a meeting with the vendor: the meeting where the vendor is finally expected to decide which tool their organization should pick for its BI needs. With Power BI and Tableau joining the vendor, the conversation starts on a light note, with both tools introducing themselves.

Tableau: Hi, I am Tableau. I make it easy for companies all around the world to see and understand their data. I provide different visualization tools, drag & drop features, metadata management, and data notifications, among other exciting features.

Power BI: Hello, I am Power BI. I am a cloud-based analytics and Business Intelligence platform. I provide a full overview of critical data to organizations across the globe. I allow companies to easily share data by connecting data sources and helping them create reports. I also help create scalable dashboards for visualization.

The vendor nods convincingly in agreement while making notes about the two tools.

Vendor: May I know what each one of you offers in terms of visualization?

Tableau: Sure. I let users create 24 different types of baseline visualizations, including heat maps, line charts, and scatter plots. Not to brag, but you don't need intense coding knowledge to develop high-quality, complex visualizations with me. You can also ask me "what if" questions regarding the data, and I provide unlimited data points for analysis.

The vendor seems noticeably pleased with Tableau's reply.

Power BI: I allow users to create visualizations by asking questions in natural language using Cortana. Uploading data sets is quite easy with me. You can select from a wide range of visualizations as blueprints and then insert data from the sidebar into the visualization.

Tableau passes a glittery, infectious smirk and excitedly throws a question at Power BI.

Tableau: Wait, what about data points? How many data points can you offer?

The vendor looks at Power BI with a straight face, waiting for a reply.

Power BI: For now, I offer 3,500 data points for data analysis.

Vendor: Umm, okay. But won't the 3,500 data point limit reduce effectiveness for users?

Tableau cuts off Power BI as it tries to answer, replying to the vendor with a distinct sense of urgency in its voice.

Tableau: It will!
Due to the 3,500 data point limit, many visuals can't display a large amount of data, so filters are added. As the data gets filtered automatically, outliers get missed.

Power BI looks visibly irritated after Tableau's response and looks at the vendor for slight hope, while the vendor seems more inclined towards Tableau.

Vendor: Okay. Noted. What can you tell me about your compatibility with data sources?

Tableau: I support hundreds of data connectors, including online analytical processing (OLAP), big data options (such as NoSQL and Hadoop), as well as cloud options. I am capable of automatically determining the relationships between data added from multiple sources. I also let you modify data links, or create them manually, based on your company's preferences.

Power BI: I help users connect to external sources including SAP HANA, JSON, MySQL, and more. When data is added from multiple sources, I can automatically determine the relationships between them. In fact, I let users connect to Microsoft Azure databases, third-party databases, files, and online services like Salesforce and Google Analytics.

Vendor: Okay, that's great! Can you tell me what your customer support is like?

Tableau jumps in to answer the question first, yet again.

Tableau: I offer direct support by phone and email, and customers can also log in to the customer portal to submit a support ticket. Subscriptions are provided in three categories: Desktop, Server, and Online. There are support resources for each subscription version of the software, and users are free to access the resources for their version. I provide getting-started guides, best practices, and guidance on the platform's top features. A user can also access the Tableau community forum and attend training events.

The vendor seems highly pleased with Tableau's answer and continues scribbling in his notebook.

Power BI: I offer faster customer support to users with a paid account. However, all users can submit a support ticket. I also provide robust support resources and documentation, including learning guides, a user community forum, and samples of how my partners use the platform. Customer support functionality is limited for users with a free Power BI account, though.

Vendor: Okay, got it! Can you tell me about your learning curves? Do you get along well with novice users too, or just professionals?

Tableau: I am a very powerful tool, and data analysts around the world are my largest customer base. I must confess, I am not the most intuitive, but given the powerful visualization features I offer, I see no harm in people getting acquainted with data science a bit before they choose me. In a nutshell, it can be a bit tricky for novices to transform and clean visualizations with me.

Tableau looks at the vendor for approval, but he is busy making notes.

Power BI: I am the citizen data scientist's ally. From common stakeholders to data analysts, there are features for almost everyone on board as far as I am concerned. My interface is quite intuitive and relies more on drag-and-drop features to build visualizations, which makes it easy for users to play around with the interface. It doesn't matter whether you're a novice or a pro; there's space for everyone here.

A green monster of jealousy takes over Tableau as it scoffs at Power BI.

Tableau: You are only compatible with Windows.
I, on the other hand, am compatible with both Windows and Mac OS. And let's be real: it's tough to do even simple calculations with you, such as creating a percent-of-total variable, without learning the DAX language.

As the flood of anger rises in Power BI, the vendor interrupts them.

Vendor: May I just ask one last question before I get ready with the results? How heavy are you on my pockets?

Power BI: I offer three subscription plans: Desktop, Pro, and Premium. Desktop is the free version. Pro is for professionals and starts at $9.99 per user per month; you get additional features such as data governance, content packaging, and distribution, and I also offer a 60-day trial with Pro. Premium is built on capacity pricing, which means I charge per node per month. You get even more powerful features, such as a premium cost calculator for custom quote ranges, based on the number of pro, frequent, and occasional users active on an account's premium version.

The vendor seems a little dazed as he continues making notes.

Tableau: I offer three subscriptions as well: Desktop, Server, and Online. Prices are charged per user per month but billed annually. The Desktop category comes with two options: the Personal edition (starting at $35) and the Professional edition (starting at $70). The Server option offers on-premises or public cloud capabilities, starting at $35, while the Online version is fully hosted and starts at $42. I also offer a free version, Tableau Public, with which users can create visualizations, save them, and share them on social media or their blog, with a 10 GB storage limit. I also offer a 14-day free trial so users can get a demo before the purchase.

Tableau and Power BI both wait anxiously for the vendor's reply as he continues scribbling in his notebook while making quizzical expressions.

Vendor: Thank you so much for attending this meeting. I'll be right back with the results; I just need to check on a few things.

Tableau and Power BI watch the vendor leave the room, and heavy anticipation fills the air.

Tableau: Let's be real, I will always be the preferred choice for data visualization.

Power BI: We shall see about that. Don't forget that I also offer data visualization tools, along with predictive modeling and reporting.

Tableau: I have a better job market!

Power BI: What makes you say that? I think you need to re-check Gartner's Magic Quadrant, as I am right beside you on it.

Power BI looks at Tableau with a hot rush of astonishment as the vendor enters the room. The vendor smiles at Tableau as he continues the discussion, which makes Power BI slightly uneasy.

Vendor: Tableau and Power BI, you both offer great features, but as you know, I can only pick one of you for the organization.

An air of suspense fills the atmosphere.

Vendor: Tableau, you are a great data visualization tool, with versatile built-in features such as the user interface layout, visualization sharing, and intuitive data exploration. Power BI, you offer real-time data access along with some pretty handy drag-and-drop features. You help create visualizations quickly and give even novice users access to powerful data analytics without any prior knowledge.

The tension notches up even more as the vendor keeps talking.

Vendor: Tableau! You're a great data visualization tool, but your price point is quite high. This is one of the reasons why I choose Microsoft Power BI.
Microsoft Power BI offers data visualization, connects to external data sources, and lets you create reports, all at low cost. Hence, Power BI, welcome aboard!

A sense of infinite peace and pride emanates from Power BI. The meeting ends with Power BI and the vendor shaking hands as Tableau silently leaves the room.

We took a peek into the vendor's notebook and saw this comparison table:

| | Power BI | Tableau |
| --- | --- | --- |
| Visualization capabilities | Good | Very Good |
| Compatibility with multiple data sources | Good | Good |
| Customer support quality | Good | Good |
| Learning curve | Very Good | Good |
| System compatibility | Windows | Windows & Mac OS |
| Cost | Low | Very high |
| Job market | Good | Good |
| Analytics | Very Good | Good |

Both Business Intelligence tools are in demand by organizations all over the world. Tableau is fast and agile. It provides a comprehensible interface along with visual analytics, where users have the ability to ask and answer questions. Its versatility and success stories make it a good choice for organizations willing to invest in a higher-budget Business Intelligence software. Power BI, on the other hand, offers almost all the same features as Tableau, including data visualization, predictive modeling, reporting, and data prep, at one of the lowest subscription prices in today's market. Nevertheless, upgrades are being made to both Business Intelligence tools, and we can only wait to see what's next for these technologies.

Read next:
- Building a Microsoft Power BI Data Model
- "Tableau is the most powerful and secure end-to-end analytics platform": An interview with Joshua Milligan
- Unlocking the secrets of Microsoft Power BI

Why AWS is the preferred cloud platform for developers working with big data

Savia Lobo
07 Jun 2018
4 min read
The cloud computing revolution has well and truly begun. But the market is fiercely competitive: a handful of cloud vendors lead the pack, and it's not easy to say which one is best. AWS, Google Cloud Platform, Microsoft Azure, and Oracle are leading the way when it comes to modern cloud-based infrastructure, and it's hard to separate them.

Big data is in high demand, as it lets businesses flesh out useful insights. Organizations carry out advanced analytics to gain a deep, exploratory perspective on their data. After a deep analysis is performed, BI tools such as Tableau, Microsoft Power BI, and Qlik Sense are used to draft dashboard visualizations, reports, performance management metrics, and so on, making the data analytics actionable. This is how analytics and BI tools help get the best out of big data. And in this year's Skill Up survey, there emerged a frontrunner for developers working with big data: AWS. (Source: Packt Skill Up Survey)

Let's talk AWS

Amazon is said to outplay the other cloud platform players in the market. AWS provides its customers with a highly robust infrastructure with commendable security options. In its inception year, 2006, AWS already had more than 150,000 developers signed up to use its services, as Amazon announced in a press release that year. In a recent survey conducted by Synergy Research, AWS is among the top cloud platform providers, with a 35% market share. Its top customers include NASA, Netflix, Adobe Systems, Airbnb, and many more. Cloud technology is no longer a new and emerging trend; it has truly become mainstream. What sets AWS on a different plateau is that it has caught developers' attention with its impressive suite of developer tools. It's a cloud platform designed with continuous delivery and DevOps in mind.

AWS: Every developer's den

Once you're an AWS member, you can experience the hundreds of different platforms it offers. Starting from core computation and content delivery networks, one can even take advantage of its IoT and game development platforms. If you're worried about how to pay for all that you use, don't be: AWS offers its complete package of solutions across six modes of payment. It also offers hundreds of templates in every programming language to get you going on your choice of project. The pay-as-you-go model enables customers to use only the features they require, which avoids the unnecessary purchase of resources that would add no value to the business.

Security on AWS is something users appreciate. AWS's configuration options, management policies, and reliable security are the reasons one can easily trust its cloud services. AWS has layers of encryption that enable high-end user data protection, and it assigns user privileges using IAM (Identity and Access Management) roles. This helps restrict the resources available to each user, and greatly reduces misuse.

AWS also provides developers with autoscaling, one of the most important features every developer needs. With Auto Scaling, routine capacity management runs on autopilot; AWS takes care of it, so developers can focus on processes, development, and programming. The free tier within AWS includes an Amazon EC2 instance, S3 storage, EC2 compute hours, Elastic Load Balancer time, and so on. This enables developers to try the AWS APIs within their software to enhance it further.

AWS also cuts down the deployment time required to provision a web server. By using Amazon Machine Images, one can have a machine deployed and ready to accept connections in a short time.
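As an illustration of that short path from AMI to running machine, here is a hedged sketch using the AWS SDK for JavaScript (v2); the AMI ID is a placeholder, and the region and instance type are illustrative choices:

```javascript
// Launch a single EC2 instance from an Amazon Machine Image.
const AWS = require('aws-sdk');
const ec2 = new AWS.EC2({ region: 'us-east-1' }); // illustrative region

const params = {
  ImageId: 'ami-0123456789abcdef0', // placeholder AMI ID
  InstanceType: 't2.micro',         // free-tier eligible instance type
  MinCount: 1,
  MaxCount: 1
};

ec2.runInstances(params, (err, data) => {
  if (err) return console.error('Launch failed:', err);
  console.log('Launched instance:', data.Instances[0].InstanceId);
});
```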
Amazon's logo says it all: it provides A to Z services under one roof for developers, businesses, and general users. Each service is tailored to serve a different purpose and has dedicated, specialized hardware behind it. Developers can easily choose Amazon for their development needs with its pay-as-you-go model and make the most of it without buying hardware up front. Though there are other service providers, such as Microsoft Azure and Google Cloud Platform, Amazon offers functionality that the others are yet to match.

Read next:
- Verizon chooses Amazon Web Services (AWS) as its preferred cloud provider
- How to secure ElasticCache in AWS
- How to create your own AWS CloudTrail

Top 5 cybersecurity assessment tools for networking professionals

Savia Lobo
07 Jun 2018
6 min read
Security is one of the major concerns when setting up data centers in the cloud. Although most organizations deploy firewalls and managed networking components in their data centers, they still fear being attacked by intruders. As such, organizations constantly seek tools that can help them gauge how vulnerable their network is and how to secure the applications within it.

Many confuse security assessment with penetration testing and use the terms interchangeably. However, there is a notable difference between the two. A security assessment is a process of finding the different vulnerabilities within a system and prioritizing them based on severity and business criticality. Penetration testing, on the other hand, simulates a real-life attack and maps out the paths a real attacker would take to carry it out. You can check out our article, Top 5 penetration testing tools for ethical hackers, to learn about some of the pentesting tools.

A plethora of tools exists in the market, and every tool claims to be the best. Here is our list of the top 5 tools to secure your organization over the network.

Wireshark

Wireshark is one of the most popular tools for packet analysis. It is open source under the GNU General Public License. Wireshark has a user-friendly GUI and also supports a command-line interface. It is a great debugging tool for developers who wish to develop a network application, and it runs on multiple platforms, including Windows, Linux, Solaris, and NetBSD. The Wireshark community also hosts SharkFest, launched in 2008, for Wireshark developers and the user community. The main aim of this conference is to support Wireshark development and to educate current and future generations of computer science and IT professionals on how to use the tool to manage, troubleshoot, diagnose, and secure traditional and modern networks. Some benefits of using this tool include:

- Live real-time traffic analysis, with support for offline analysis as well
- Depending on the platform, the ability to read live data from Ethernet, PPP/HDLC, USB, IEEE 802.11, Token Ring, and many others
- Decryption support for several protocols, such as IPsec, ISAKMP, Kerberos, SNMPv3, SSL/TLS, WEP, and WPA/WPA2
- Captured traffic that can be browsed via the GUI or via the TTY-mode TShark utility, a network protocol analyzer used to analyze packets from hosts without a UI
- The most powerful display filters in the whole industry

Nmap

Network Mapper, popularly known as Nmap, is an open source tool for conducting network discovery and security auditing. It is also used for tasks such as network inventory management, monitoring host or service uptime, and much more. Nmap works by sending raw IP packets to find the available hosts on a network, the services they offer, the operating systems they run, the firewalls they use, and much more. Nmap is essential for quickly scanning large networks, and can also be used to scan single hosts. It runs on all major operating systems and provides official binary packages for Windows, Linux, and Mac OS X. It also includes:

- Zenmap: an advanced security scanner GUI and results viewer
- Ncat: a tool used for data transfer, redirection, and debugging
- Ndiff: a utility for comparing scan results
- Nping: a packet generation and response analysis tool

Nmap is traditionally a command-line tool run from a Unix shell or the Windows command prompt. This makes Nmap easy to script and allows easy sharing of useful commands within the user community; experts do not have to move through different configuration panels and scattered option fields.
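A few illustrative invocations give a feel for the tool (the private-network targets are placeholders; scanme.nmap.org is Nmap's own designated test host):

```sh
nmap -sS -sV 192.168.1.0/24      # TCP SYN scan with service/version detection
nmap -O 192.168.1.10             # OS fingerprinting of a single host (requires root)
nmap -p 1-1000 scanme.nmap.org   # scan the first 1,000 TCP ports of a host
```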
Nessus

Nessus, a product of Tenable, is one of the most popular vulnerability scanners, specifically for UNIX systems. The tool is constantly updated, with 70k+ plugins. Nessus is available in both free and paid versions; the paid version costs around $2,190 per year, whereas the free version, Nessus Home, offers limited usage and is licensed only for home networks. Customers choose Nessus because:

- It includes simple steps for policy creation and needs just a few clicks to scan an entire corporate network
- It offers vulnerability scanning at a low total cost of ownership (TCO)
- It enables quick and accurate scanning with few false positives
- It has an embedded scripting language that lets users write their own plugins and understand existing ones

QualysGuard

QualysGuard is a well-known SaaS (Software-as-a-Service) vulnerability management tool. It has a comprehensive vulnerability knowledge base, using which it provides continuous protection against the latest worms and security threats. It proactively monitors all network access points, so security managers can spend less time researching, scanning, and fixing network vulnerabilities. This helps organizations deal with network vulnerabilities before they can be exploited. It provides a detailed technical analysis of threats via powerful, easy-to-read reports; a report includes the security threat, the consequences if the vulnerability is exploited, and a recommended solution to fix it. One can get a summary of overall security from QualysGuard's executive dashboard, which displays the number of new, active, and re-opened vulnerabilities, along with a graph of vulnerabilities by severity level. Get to know more about QualysGuard on its official website.

Core Impact

Core Impact is widely used as a comprehensive tool to assess and test security vulnerabilities within an organization. It includes a large database of professional exploits and is regularly updated. It assists in cleanly exploiting one machine and then creating an encrypted tunnel through it to exploit other machines. Core Impact provides a controlled environment that mimics real attacks, which helps you secure your network before an actual attack occurs. One interesting feature of Core Impact is that one can fully test a network, irrespective of its size, quickly and efficiently.

These are five popular tools network security professionals use to assess their networks. However, there are many other tools, such as Netsparker, OpenVAS, and Nikto, for assessing the security of a network. Every security assessment tool is unique in its own way, but it all boils down to one's own expertise and experience, and the kind of project environment the tool is used in.

Read next:
- Top 5 penetration testing tools for ethical hackers
- Intel's Spectre variant 4 patch impacts CPU performance
- Pentest tool in focus: Metasploit
Python, Tensorflow, Excel and more - Data professionals reveal their top tools

Amey Varangaonkar
06 Jun 2018
4 min read
Data professionals are constantly on the lookout for the best tools to simplify their data science tasks, be it data acquisition, machine learning, or visualizing the results of an analysis. With so much on their plate already, having robust, efficient tools in the arsenal helps a lot in reducing procedural complexity, and considerably cuts down the time these tasks take. But what tools do data professionals rely on to make their lives easier? Thanks to the Skill-up 2018 survey that we recently conducted, we have some interesting observations to share with you! Read the Skill Up report in full; sign up to our weekly newsletter and download the PDF for free.

Key takeaways

- Python is the programming language most widely used by data professionals
- Python finds wide adoption across all spectrums of data science, including data analysis, machine learning, deep learning, and data visualization
- Excel continues to be favored by data professionals because of its effectiveness and simplicity
- R is slowly falling behind Python in the race to data science supremacy

Now, let's look at these observations in more depth.

Python continues its ascension as the top dog

Python's rise in popularity as well as adoption over the last 3 years has been quite staggering, to say the least. Python's ease of use, its powerful analytical and machine learning capabilities, and its applications outside of data science make it a popular language across the tech community. It thus comes as no surprise that it stood out from the rest and was the undisputed language of choice among the data pros. R, on the other hand, seems to be finding it difficult to catch up to Python, with less than half the number of votes, despite being the tool of choice for many statisticians and researchers. Is the paradigm shift well and truly on? Is Python edging R out for good? (Source: Packt Skill-Up Survey 2018)

It is interesting to see SQL at number 2, but considering the number of people working with databases these days, it doesn't come as a surprise. Also, JavaScript is preferred over Java, indicating the rising need for web-based dashboards for effective Business Intelligence.

Data professionals still love Excel, but Python libraries are taking over

Microsoft Excel has traditionally been a highly popular tool for data analysis, especially when dealing with data with hundreds and thousands of records. Excel's perfect setting for data manipulation and charting continues to be the reason why people still use it for basic data analysis, as indicated by our survey: almost 53% of respondents keep Excel in their analysis toolkit for day-to-day tasks. (Chart: Top libraries, tools, and frameworks used by data professionals. Source: Packt Skill-Up Survey 2018)

The survey also indicated Python's rising dominance in the data science domain, with 8 out of the 10 most-used tools for data analysis being Python-based. Python's offerings for data wrangling, scientific computing, machine learning, and deep learning make its libraries the obvious choice for data professionals. Here's a quick look at 15 useful Python libraries that make the above-mentioned data science tasks easier.

TensorFlow and PyTorch are in demand

AI's popularity is soaring with every passing day as it finds applications across all types of industries and business domains.
In our survey, we found machine learning and deep learning to be two of the most valuable skills for any data scientist, as can be seen from the word cloud of the most valued skills by data professionals (Source: Packt Skill-Up Survey). Python's two popular deep learning frameworks, TensorFlow and PyTorch, have thus gained a lot of attention and adoption in recent times. Along with Keras, another Python library, these two are the frameworks most used by data scientists and ML developers for building efficient machine learning and deep learning models.

Which languages and libraries do you use for your everyday data science tasks? Do you agree with your peers' choice of tools? Feel free to let us know!

Read next:
- Data cleaning is the worst part of data analysis, say data scientists
- 30 common data science terms explained
- Top 10 deep learning frameworks

The best backend tools in web development

Sugandha Lahoti
06 Jun 2018
5 min read
If you're a backend developer, it's easy to feel overwhelmed by the range of backend development tools available. It goes without saying that you should use what works for you, but sometimes it's not that easy to even work that out. With this in mind, this year's Skill Up report offers a useful insight into some of the most popular backend tools in use today. Let's take a look at the tools that came out on top; that should help you decide what to use, or maybe what to learn next. Read the Skill Up report in full; sign up to our weekly newsletter and download the PDF for free.

Node.js

More than 50% of respondents said they prefer Node.js, the popular server-side JavaScript framework. Node.js is a JavaScript runtime built on the V8 JavaScript engine. It adds capabilities to JavaScript (traditionally a front-end language) to let it do more than just create interactive websites, and it uses an event-driven, non-blocking I/O model that makes it lightweight and efficient. The latest stable release, Node 10, will be the next candidate in line for Long Term Support (LTS) in October 2018. Node.js 10.0 comes with plenty of new features, like the OpenSSL 1.1.0 security toolkit, an upgraded npm, N-API, and much more. Get started with learning Node.js with the following books:

- Learning Node.js Development
- Learn Node.js by Building 6 Projects
- RESTful Web API Design with Node.js 10 - Third Edition

ASP.NET Core

The next most popular alternative was ASP.NET Core, with over 25% of developers choosing it as their backend framework. ASP.NET Core is an open-source, cross-platform framework for building backends, web apps and services, and IoT apps. According to the Skill Up survey, it was also one of the most popular frameworks used by developers overall. It provides a cloud-ready, environment-based configuration system and seamlessly integrates with popular client-side frameworks and libraries, including Angular, React, and Bootstrap. Get started with ASP.NET Core by reading:

- Learning ASP.NET Core 2.0
- Mastering ASP.NET Core 2.0
- ASP.NET Core 2 High Performance - Second Edition

Express.js

Developers and tech pros also like to work with Express.js, which ranked no. 3 on our list. Express.js is a pre-built Node.js framework that helps developers build faster and smarter websites and web apps; it basically extends Node.js to build complete web applications. It is the perfect framework to learn for developers who are fluent in Node.js but want to move beyond plain server-side technologies to creating full apps. Express is lightweight and comes with extra, built-in web application features and the Express API to support the already robust, feature-packed Node.js platform. Express is not limited to Node.js either; it works seamlessly with other modules and offers HTTP utilities and middleware for creating APIs. It can help developers master single-page and multi-page websites, as well as complex web apps; a basic server takes only a few lines of code, as the sketch below shows. You can go through Projects in ExpressJS [Video], a complete course to learn professional web development using Express.js.
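As a minimal illustration of the framework (assuming express has been installed via npm; the route and port are arbitrary choices):

```javascript
// A tiny Express server exposing one JSON route.
const express = require('express');
const app = express();

// A single GET route, the typical building block of a REST API.
app.get('/api/greeting', (req, res) => {
  res.json({ message: 'Hello from Express' });
});

app.listen(3000, () => console.log('Server listening on port 3000'));
```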
Laravel

Next was Laravel, a prominent member of a new generation of web frameworks. It is one of the most popular PHP frameworks and is also free and open source. It features:

- A simple, fast routing engine
- A powerful dependency injection container
- Multiple back-ends for session and cache storage
- Database-agnostic schema migrations
- Robust background job processing
- Real-time event broadcasting

The latest stable release, Laravel 5, is a substantial upgrade with a lot of new toys, while retaining the features that made Laravel wildly successful. It comes with plenty of architectural as well as design-based changes. Start building with Laravel with these videos:

- Beginning Laravel [Video]
- Laravel Foundations: Basics to Every App [Video]

Java EE

The fifth most popular choice of backend tool was Java EE. The Enterprise Java standard, or Java EE, is a collection of technologies and APIs for the Java platform designed to support the enterprise; by enterprise, we mean applications classified as large-scale, distributed, transactional, and highly available, designed to support mission-critical business requirements. Applications written to comply with the Java EE specification do not tie developers to a specific vendor; instead, they can be deployed to any Java EE-compliant application server. The Java EE server implements the Java EE platform APIs and provides the standard Java EE services. The latest stable release, Java EE 8, brings with it a load of features, mainly targeting newer architectures such as microservices, modernized security APIs, and cloud deployments. Our best picks for learning Java EE:

- Java EE 8 Application Development
- Architecting Modern Java EE Applications
- Java EE 8 High Performance

Other backend tools among developers' top picks included:

- Spring: a programming and configuration model for building modern Java-based enterprise applications, on any kind of deployment platform.
- Django: a powerful Python web framework for creating RESTful web services. It reduces the amount of trivial code, which simplifies the creation of web applications and results in faster development.
- Flask: a framework for building web servers in Python. It is a microframework, meaning it is not a full-stack web application development framework; it just gives developers the very basics to get a web server running.
- Firebase: Google's mobile platform that helps developers run mobile backend code without managing servers and develop high-quality apps.
- Ruby on Rails: one of the oldest backend technologies. A certain percentage of people still prefer Ruby on Rails for their backend code. Rails is a flexible and IDE-friendly framework with easy functions and manipulations and the support of the powerful Ruby language.

The entire Skill Up survey report can be read on the Packt website; it details what developers think about the changing tech landscape and the parameters driving that change. The survey report launches at the start of the Skill Up campaign, during which every eBook and video is available for $10. Go grab your free content now!

What are web developers favorite front-end tools? Packt’s Skill Up report reveals all

Sugandha Lahoti
06 Jun 2018
5 min read
Are you confused about which front-end tools you should learn, and which ones you should work with? Do you want to know what other web developers are using and what they think is important when it comes to front-end frameworks and libraries? Fear not! We have the answers to all these questions. In our annual Skill Up survey, we spoke to over 8,000 developers and tech pros and asked them about the front-end tools, libraries, and frameworks they regularly use. Of course, choosing the perfect front-end technology depends on your skills and your area of interest. However, given the umpteen front-end frameworks available nowadays, we have narrowed the field down to just a few.

jQuery

(Creator: John Resig. Released: 2006. Current version: 3.3.1. Popularity: 49,100 stars on GitHub)

jQuery came out as the undisputed champion of this showdown, with over 70% of respondents choosing it as their go-to front-end library. jQuery is a fast, lightweight, and concise JavaScript library mainly used for HTML document traversal, event handling, animation, and Ajax interactions for rapid web development. It's a cross-platform JavaScript library that simplifies JavaScript development by reducing coding time. Although JavaScript has a large number of libraries, jQuery stands out because of its countless tutorials, its freedom from cross-platform and cross-browser issues, its great user interface, its large number of plugins, and its light, fast, quick-to-learn nature. Essentially, jQuery is best suited for applications that need rapid development; the sketch below shows a typical slice of jQuery in action. Get started with jQuery with the following resources:

- Getting Started with jQuery 3 [Video]
- Beginning JavaScript and jQuery [Video]
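Here is a small, hedged sketch of those core features, namely document traversal, event handling, and an Ajax call (the element IDs and endpoint URL are placeholders):

```javascript
// When the button is clicked, fetch articles and render them into a list.
$(document).ready(function () {
  $('#load-button').on('click', function () {
    $.ajax({
      url: '/api/articles',   // placeholder endpoint
      method: 'GET',
      success: function (articles) {
        $('#article-list').empty();
        $.each(articles, function (i, article) {
          $('#article-list').append($('<li>').text(article.title));
        });
      }
    });
  });
});
```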
Bootstrap

(Creators: Mark Otto and Jacob Thornton. Released: 2011. Current version: 4.1.1. Popularity: 125,108 stars on GitHub)

Over 65% of developers chose Bootstrap as their favorite framework to use. And why not, considering Bootstrap is everywhere. Bootstrap is an open source toolkit for developing with HTML, CSS, and JS. Its huge popularity is mainly down to its simplicity of use, its great community, and the very large number of articles and tutorials, third-party plugins and extensions, theme builders, and so on that surround it. Our top picks for learning Bootstrap:

- Bootstrap 4 Site Blueprints
- Learning Web Development with React and Bootstrap
- Bootstrap 4 Cookbook

npm (Node package manager)

(Creator: Isaac Z. Schlueter. Released: 2011. Current version: 6.1.0. Popularity: 16,597 stars on GitHub)

npm is the package manager for Node. With npm, developers can install various modules for web development, share and borrow packages, and manage private development. It consists of three distinct components:

- Website: used to discover packages, set up profiles, and manage other aspects of the npm experience.
- Command Line Interface (CLI): runs from a terminal; developers interact with npm through the CLI.
- Registry: a large public database of JavaScript software and the meta-information surrounding it, containing over 600,000 packages (building blocks of code).

AngularJS

(Creator: Google. Released: 2016. Current version: 6.0.0. Popularity: 58,576 stars on GitHub)

If you're looking to build dynamic and robust single-page applications, AngularJS is the framework you need. Angular is a core part of the MEAN stack (MongoDB, ExpressJS, AngularJS, and Node.js), so you can now use JavaScript for both your site's front end and back end. Angular is highly modular, which makes it great for dividing up large-scale work across a team and also makes testing and debugging easy. It pairs with AJAX for amazing speed and can handle heavy user interaction via forms. The functionality-first approach makes Angular more focused on features, making developers' jobs easier. Moreover, it has excellent tools and support from the Google community. Start learning Angular with these books:

- Learning Angular - Second Edition
- Essential Angular 4
- Angular By Example - Third Edition

Webpack

(Creators: Tobias Koppers, Sean Larkin, Johannes Ewald, Juho Vepsäläinen, and Kees Kluskens. Released: 2012. Current version: 4.8.3. Popularity: 41,366 stars on GitHub)

Webpack is a module bundler for modern JavaScript applications. It is a tool that has been around for a number of years but has recently seen its popularity grow, which is why developers awarded it the no. 5 spot on our list. Webpack, quite simply, brings all the assets you need in front-end development, like JavaScript, fonts, and images, into one place. This is particularly useful if you're developing complicated front ends. You can go through Deploying Web Applications with Webpack to get up and running with Webpack.

Other front-end tools among developers' top picks included:

- Sass: a web design framework and CSS preprocessor that adds special features such as variables, nested rules, and mixins to regular CSS.
- React: one of the most popular JavaScript libraries for building UI interfaces. It provides speed, simplicity, and scalability for creating single-page applications and mobile applications.
- Gulp: an open-source JavaScript toolkit mainly used for automation tasks such as bundling and minifying libraries and stylesheets, quickly running unit tests, and running code analysis.
- Vue: an incrementally adoptable, fast-growing JavaScript framework. It is much simpler than Angular, both in terms of API and design. It is a presentation layer rather than a full-scale framework, so you can easily combine Vue with other libraries.

The entire Skill Up survey report can be downloaded from the Packt website; it details what developers think and feel about the changing tech landscape.

Read next:
- Developers think managers don't know enough about technology. And that's hurting business.
- Don't call us ninjas or rockstars, say developers
- 96% of developers believe developing soft skills is important

The 10 most common types of DoS attacks you need to know

Savia Lobo
05 Jun 2018
11 min read
There are businesses that are highly dependent on their services hosted online, and it's important that their servers are up and running smoothly during business hours. Stock markets and casinos are examples of such institutions: they deal with huge sums of money and expect their servers to work properly during their core business hours. Hackers may extort money by threatening to take down or block these servers during those hours. The denial of service (DoS) attack is the most common methodology used to carry out these kinds of attacks. In this post, we will get to know DoS attacks and their various types. This article is an excerpt taken from the book Preventing Ransomware, written by Abhijit Mohanta, Mounir Hahad, and Kumaraguru Velmurugan. In this book, you will learn how to respond quickly to ransomware attacks to protect yourself.

What are DoS attacks?

DoS is one of the oldest forms of cyber extortion attack. As the term indicates, denial of service means denying service to a legitimate user. If a railway website is brought down, it fails to serve the people who want to book tickets. Let's take a peek into some of the details. A DoS attack can happen in two ways:

- Specially crafted data: If specially crafted data is sent to the victim and the victim is not set up to handle it, there is a chance the victim will crash. This does not involve sending too much data; it involves specially crafted data packets that the victim fails to handle, for instance by manipulating fields in network protocol packets or by exploiting servers. The ping of death and teardrop attacks are examples of such attacks.
- Flooding: Sending too much data to the victim can also slow it down, as it spends its resources consuming the attacker's data and fails to serve legitimate requests. UDP flooding and SYN flooding are examples of such attacks.

Attacks can also use a combination of both. There is another form of DoS attack called a distributed denial of service (DDoS) attack: a DoS attack uses a single computer to carry out the attack, whereas a DDoS attack uses a series of computers. Sometimes the target server is flooded with so much data that it can't handle it; another way is to exploit the workings of internal protocols. A DDoS attack that deals with extortion is often termed a ransom DDoS. We will now talk about the various types of DoS attacks that can occur.

Teardrop attacks or IP fragmentation attacks

In this type of attack, the hacker sends a specially crafted packet to the victim. To understand this, one must have knowledge of the TCP/IP protocol. In order to transmit data across networks, IP packets are broken down into smaller packets; this is called fragmentation. When the packets finally reach their destination, they are reassembled to get the original data. In the process of fragmentation, some fields are added to the fragmented packets so that they can be tracked at the destination while reassembling. In a teardrop attack, the attacker crafts packets that overlap with each other. Consequently, the operating system at the destination gets confused about how to reassemble the packets, and it crashes.

User Datagram Protocol flooding

User Datagram Protocol (UDP) is an unreliable protocol: the sender of the data does not care whether the receiver has received it. In UDP flooding, many UDP packets are sent to the victim on random ports. When the victim gets a packet on a port, it looks for an application listening on that port. When it does not find one, it replies with an Internet Control Message Protocol (ICMP) packet (ICMP packets are used to send error messages). When a lot of UDP packets are received, the victim consumes a lot of resources replying with ICMP packets, which can prevent it from responding to legitimate requests.

SYN flood

TCP is a reliable connection: it makes sure that the data sent by the sender is completely received by the receiver. To start communication between a sender and a receiver, TCP follows a three-way handshake, where SYN denotes the synchronization packet and ACK stands for acknowledgment. The sender starts by sending a SYN packet, the receiver replies with SYN-ACK, and the sender sends back an ACK packet followed by the data. In SYN flooding, the sender is the attacker and the receiver is the victim. The attacker sends a SYN packet and the server responds with SYN-ACK, but the attacker never replies with the final ACK. The server waits for the ACK until timeout; since the attacker sends a lot of SYN packets, the server exhausts its resources waiting for final ACKs that never arrive. This kind of attack is called SYN flooding.
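To make the resource-exhaustion mechanics concrete, here is a toy simulation (not real networking code): the server tracks half-open connections in a bounded backlog, and the queue size is an illustrative value:

```javascript
// Simulate a server's half-open connection queue under a SYN flood.
const BACKLOG_SIZE = 5;       // max half-open connections tracked (illustrative)
const halfOpen = new Set();   // clients that sent SYN but no final ACK yet

function receiveSyn(clientId) {
  if (halfOpen.size >= BACKLOG_SIZE) {
    console.log(`SYN from ${clientId} dropped: backlog full`);
    return;
  }
  halfOpen.add(clientId);     // server replies SYN-ACK and waits for the ACK
}

function receiveAck(clientId) {
  halfOpen.delete(clientId);  // handshake completed, slot freed
}

// The attacker fills the backlog with SYNs and never completes the handshake...
for (let i = 0; i < BACKLOG_SIZE; i++) receiveSyn(`attacker-${i}`);

// ...so a legitimate client's SYN is dropped.
receiveSyn('legitimate-client'); // prints: SYN from legitimate-client dropped: backlog full
```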
User Datagram Protocol flooding

User Datagram Protocol (UDP) is an unreliable, connectionless protocol. This means the sender of the data does not care whether the receiver has received it. In UDP flooding, many UDP packets are sent to the victim on random ports. When the victim gets a packet on a port, it looks for an application that is listening on that port. When it does not find one, it replies with an Internet Control Message Protocol (ICMP) packet. ICMP packets are used to send error messages. When a lot of UDP packets are received, the victim consumes a lot of resources replying with ICMP packets. This can prevent the victim from responding to legitimate requests.

SYN flood

TCP is a reliable protocol: it makes sure that the data sent by the sender is completely received by the receiver. To start communication between the sender and receiver, TCP follows a three-way handshake. SYN denotes the synchronization packet and ACK stands for acknowledgment. The sender starts by sending a SYN packet and the receiver replies with SYN-ACK. The sender then sends back an ACK packet followed by the data. In SYN flooding, the sender is the attacker and the receiver is the victim. The attacker sends a SYN packet and the server responds with SYN-ACK, but the attacker never replies with an ACK packet. The server waits for the final ACK until timeout while the attacker keeps sending more SYN packets. Hence, the server exhausts its resources waiting for ACKs. This kind of attack is called SYN flooding.

Ping of death

While transmitting data over the internet, the data is broken into smaller chunks of packets. The receiving end reassembles these broken packets in order to derive a conclusive meaning. In a ping of death attack, the attacker sends a packet larger than 65,535 bytes, the maximum packet size allowed by the IP protocol. The packets are split and sent across the internet, but when they are reassembled at the receiving end, the operating system is clueless about how to handle the oversized packet, so it crashes.

Exploits

Exploits against servers can also cause denial of service. A lot of web applications are hosted on web servers such as Apache and Tomcat. If there is a vulnerability in one of these web servers, the attacker can launch an exploit against it. The exploit need not necessarily take control; it can simply crash the web server software, causing a DoS attack. There are easy ways for hackers to find out the web server software and its version if the server has default configurations. The attacker finds the possible vulnerabilities and exploits for that web server, and if the web server is not patched, the attacker can bring it down by sending an exploit.

Botnets

Botnets can be used to carry out DDoS attacks. A botnet herd is a collection of compromised computers. The compromised computers, called bots, act on commands from a command-and-control (C&C) server. These bots, on the commands of the C&C server, can send a huge amount of data to the victim server, and as a result the victim server is overloaded.
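On the defender's side, flood-style attacks like the ones above are usually spotted by tracking per-source packet rates. The following toy Python sketch (illustrative only; the threshold, window size, and simulated traffic are made-up values) flags sources that exceed a packets-per-second budget:

```python
import time
from collections import defaultdict, deque

# Illustrative flood detector: flag sources sending too many packets
# within a sliding one-second window. The threshold is an arbitrary example.
WINDOW_SECONDS = 1.0
MAX_PACKETS_PER_WINDOW = 100

timestamps = defaultdict(deque)  # source IP -> recent packet arrival times

def register_packet(src_ip, now=None):
    """Record a packet and return True if src_ip looks like a flooder."""
    now = time.monotonic() if now is None else now
    window = timestamps[src_ip]
    window.append(now)
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()  # drop arrivals that fell out of the window
    return len(window) > MAX_PACKETS_PER_WINDOW

# Simulated burst from one source: 150 packets in 0.15 seconds.
for i in range(150):
    flooding = register_packet("203.0.113.7", now=i * 0.001)
print(flooding)  # True
```

Real mitigations sit lower in the stack (SYN cookies, upstream scrubbing, rate limiting at the firewall), but the counting idea is the same.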
Reflective DDoS attacks and amplification attacks

In this kind of attack, the attacker uses legitimate computers to launch an attack against the victim while hiding its own IP address. The usual way is that the attacker sends a small packet to a legitimate machine after forging the sender address so that the packet looks as if it was sent from the victim. The legitimate machine will, in turn, send the response to the victim. If the response data is large, the impact is amplified. We can call these legitimate computers reflectors, and this kind of attack, where the attacker sends small data and the victim receives a much larger amount of data, is called an amplification attack. Since the attacker does not directly use computers controlled by him and instead uses legitimate computers, it's called a reflective DDoS attack.

The reflectors are not compromised machines, unlike botnets. Reflectors are machines that respond to a particular request, such as a DNS request or a Network Time Protocol (NTP) request. DNS amplification attacks, WordPress pingback attacks, and NTP attacks are amplification attacks. In a DNS amplification attack, the attacker sends a forged packet to the DNS server containing the IP address of the victim; the DNS server then replies to the victim instead, with much larger data. Other kinds of amplification attack involve SMTP, SSDP, and so on. We will look at an example of such an attack in the next section.

There are several groups of cyber criminals responsible for carrying out ransom DDoS attacks, such as DD4BC, Armada Collective, Fancy Bear, XMR-Squad, and Lizard Squad. These groups target enterprises. They will first send out an extortion email, followed by an attack if the victim does not pay the ransom.

DD4BC

The DD4BC group was first seen operating in 2014. It charged Bitcoins as the extortion fee and mainly targeted media, entertainment, and financial services. They would send a threatening email stating that a low-intensity DoS attack would be carried out first, while claiming that they could protect the organization against larger attacks. They also threatened to publish information about the attack on social media to bring down the reputation of the company.

DD4BC is known to exploit the WordPress pingback vulnerability. We don't want to get into too much detail about this vulnerability here. Pingback is a feature provided by WordPress through which the author of a WordPress site or blog gets notified when their site has been linked or referenced. We can call the site that links to another site the referrer, and the linked site the original. When the referrer links to the original, it sends the original a pingback request containing its own URL; this is a notification from the referrer informing the original site that it is linking to it. As per the protocol designed by WordPress, the original site then downloads the referrer site as a response to the pingback request, and this action is termed a reflection. The WordPress sites used in the attack are called reflectors. An attacker can misuse this by creating a forged pingback request containing the URL of a victim site and sending it to many WordPress sites. As a result, the WordPress sites all respond to the victim. Put simply, the attack notifies the WordPress sites that the victim has referred to them on its site, so all the WordPress sites try to connect to the victim, which overloads it. If the victim's web page is large and the WordPress sites try to download it, the bandwidth is choked, and this is the amplification.
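The danger of amplification is easiest to see as a ratio of response size to request size, often called the bandwidth amplification factor. The byte counts below are illustrative examples, not measurements:

```python
# Illustrative bandwidth amplification factor (BAF) calculation.
# Packet sizes are hypothetical examples, not measured values.

def amplification_factor(request_bytes, response_bytes):
    return response_bytes / request_bytes

request_bytes = 60      # a tiny spoofed query sent by the attacker
response_bytes = 3000   # a large reflected response hitting the victim
print(f"amplification: {amplification_factor(request_bytes, response_bytes):.0f}x")
# -> amplification: 50x
```

In other words, for every byte the attacker spends, the victim may absorb tens of bytes, which is what makes reflection so attractive to extortionists.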
Armada Collective

The Armada Collective group was first seen in 2015. They attacked various financial services and web hosting sites in Russia, Switzerland, Greece, and Thailand, and re-emerged in Central Europe in October 2017. They used to carry out a demo DDoS attack to threaten the victim before making their extortion demand.

This group is known to carry out reflective DDoS attacks through NTP. NTP is a protocol used to synchronize computer clocks over a network. The NTP protocol provides support for a monlist command for administrative purposes. When an administrator sends the monlist command to an NTP server, the server responds with a list of up to 600 hosts that have connected to that server. The attacker can exploit this by creating a forged NTP packet that contains a monlist command with the IP address of the victim, and then sending multiple copies to the NTP server. The NTP server thinks the monlist request has come from the victim's address and sends the victim a response containing the list of up to 600 computers connected to the server. Thus the victim receives far more data than the attacker sent, and it can crash.

Fancy Bear

Fancy Bear is one of the hacker groups we have known about since 2010. Fancy Bear threatened to use the Mirai botnet in its attacks. The Mirai botnet was known to target Linux operating systems used in IoT devices, and was mostly known for infecting CCTV cameras.

We have talked about a few groups that were infamous for carrying out DoS extortion and some of the techniques used by them. We explored different types of DoS attacks and how they can occur. If you've enjoyed this excerpt, check out 'Preventing Ransomware' to know in detail about the latest ransomware attacks involving WannaCry, Petya, and BadRabbit.

Anatomy of a Crypto Ransomware
Barracuda announces Cloud-Delivered Web Application Firewall service
Top 5 penetration testing tools for ethical hackers

Top languages for Artificial Intelligence development

Natasha Mathur
05 Jun 2018
11 min read
Artificial Intelligence is currently one of the hottest technologies. From work colleagues to your boss, chances are that most people (yourself included) wish to create the next big AI project. Artificial Intelligence is a vast field, and with so many languages to choose from, it can get a bit difficult to pick the one that will bring the most value to your project. For anyone wanting to dive into the AI space, the initial stage of choosing the right language can really decelerate the development process. Moreover, making the right choice of language for Artificial Intelligence development depends on your skills and needs. The following are the top five programming languages for Artificial Intelligence development:

1. Python

Python, hands down, is the number one programming language when it comes to Artificial Intelligence development. Not only is it one of the most popular languages in the field of data science, machine learning, and Artificial Intelligence in general, it is also popular among game developers, web developers, cybersecurity professionals, and others. It offers a ton of libraries and frameworks in machine learning and deep learning that are extremely powerful and essential for AI development, such as TensorFlow, Theano, Keras, scikit-learn, and so on. Python is the go-to language for AI development for most people, novices and experts alike.

Pros:
It's quite easy to learn due to its simple syntax. This helps in implementing AI algorithms in a quick and easy manner.
Development is faster in Python than in Java, C++, or Ruby.
It is a multi-paradigm programming language and supports object-oriented, functional, and procedure-oriented programming.
Python has a ton of libraries and tools to offer. Python libraries such as scikit-learn, NumPy, and CNTK are quite popular.
It is a portable language and can be used on multiple operating systems, namely Windows, Mac OS, Linux, and Unix.

Cons:
Integrating AI systems with non-Python infrastructure can be awkward. For example, for an infrastructure built around Java, it would be advisable to build deep learning models using Java rather than Python.

If you are a data scientist, a machine learning developer, or just a domain expert like a bioinformatician who hasn't yet learned a programming language, Python is your best bet. It is easy to learn, translates equations and logic into a few lines of code, and has a rich development ecosystem.
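As a quick, hedged illustration of that "few lines of code" point, here is a complete scikit-learn workflow; the dataset (the bundled iris data) and the model choice are arbitrary examples:

```python
# Minimal scikit-learn example: train and evaluate a classifier
# in a handful of lines. Dataset and model choice are illustrative.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2f}")
```

Loading data, splitting it, training, and scoring each take one readable line, which is the kind of ergonomics the pros above are pointing at.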
2. C++

C++ comes second on the list of top programming languages for Artificial Intelligence development. There are cases where C++ supersedes Python, even though it is not the most common language for AI development. For instance, when working in an embedded environment where you don't want the overhead of a Java Virtual Machine or a Python interpreter, C++ is a perfect choice. C++ also has some popular libraries and frameworks for AI, machine learning, and deep learning, namely mlpack, Shark, OpenNN, Caffe, Dlib, and so on.

Pros:
Execution in C++ is very fast, which is why it can be the go-to language for AI projects that are time-sensitive.
It offers substantial algorithm support and uses statistical AI techniques quite effectively.
Data hiding and inheritance make it possible to reuse existing code during the development process.
It is also suitable for machine learning and neural networks.

Cons:
It follows a bottom-up approach, which makes it very complex for large-scale projects.

If you are a game developer, you've already dabbled with C++ in some form or the other. Given the popularity of C++ among developers, it goes without saying that if you choose C++, it can definitely kickstart your AI development process to build smarter, more interactive games.

3. Java

Java is a close contender to C++. From machine learning to natural language processing, Java comes with a plethora of libraries for all aspects of Artificial Intelligence development. Java has all the infrastructure that you need to create your next big AI project. Some popular Java libraries and frameworks are Deeplearning4j, Weka, Java-ML, and so on.

Pros:
Java follows the Write Once, Run Anywhere (WORA) principle. It is a time-efficient language as it can run on any platform without recompilation every time, thanks to its virtual machine technology.
Java works well for search algorithms, neural networks, and NLP.
It is a multi-paradigm language: it supports object-oriented, procedure-oriented, and functional programming.
It is easy to debug.

Cons:
Java has a complex and verbose code structure, which can be a bit time-consuming as it increases development time.

If you are into development of software, web, mobile, or anywhere in between, you've worked with Java at some point, and probably you still are. Most commercial apps have Java baked into them. The familiarity and robustness that Java has to offer is a good reason to pick Java when working with AI development. This is especially relevant if you want to enter well-established domains like banking that are historically built on top of Java-based systems.

4. Scala

Just like Java, Scala belongs to the JVM family. Scala is a fairly new language in the AI space, but it's finding quite a bit of recognition recently in many corporations and startups. It has a lot to offer in terms of convenience, which is why developers enjoy working with it. Also, tools and libraries such as ScalaNLP and Deeplearning4j make the AI development process a bit easier with Scala. Let's have a look at the features that make it a good choice for AI development.

Pros:
It's good for projects that need scalability.
It combines the strengths of the functional and imperative programming models to act as a powerful tool that helps build highly concurrent applications while reaping the benefits of an object-oriented approach at the same time.
It provides good concurrency support, which helps with projects involving real-time parallelized analytics.
Scala has a good open source community when it comes to statistical learning, information theory, and Artificial Intelligence in general.

Cons:
Scala falls short when it comes to machine learning libraries.
Scala includes concepts such as implicits and type classes, which might not be familiar to programmers coming from the object-oriented world.
The learning curve in Scala is steep.

Even though Scala lacks machine learning libraries, its scalability and concurrency support make it a good option for AI development. With more companies such as IBM and Lightbend collaborating to use Scala for building more AI applications, demand for Scala in AI development looks set to stay constant, now and in the future.

5. R

R is a language that has recently been catching up in the race for AI development. Primarily used for academic research, R is written by statisticians, and it provides basic data management that makes analysis tasks really easy.
It's not as pricey as proprietary statistical software such as MATLAB or SAS, which makes it a great substitute for that software and a golden child of data science.

Pros:
R comes with plenty of packages that help boost its performance. There are packages available for the pre-modeling, modeling, and post-modeling stages of data analysis.
R is very efficient in tasks such as continuous regression, model validation, and data visualization.
Being a statistical language, R offers very robust statistical model packages for data analysis, such as caret, ggplot, dplyr, and lattice, which can help boost the AI development process.
Major tasks can be done with little code developed in an interactive environment, which makes it easy for developers to try out new ideas and verify them with the varied graphics functions that come with R.

Cons:
R's major drawback is its inconsistency, due to third-party algorithms.
Development speed is quite slow when it comes to R, as you have to learn new ways of data modeling. You also have to make predictions every time you use a new algorithm.

R is one of those skills that is in high demand among recruiters in data science and machine learning. Overall, R is a very clever language. It is freely available and runs on servers as well as common hardware. R can help amp up your AI development process to a great extent.

Other languages worth mentioning

There are three other languages that deserve a mention in this article: Go, Lisp, and Prolog. Let's have a look at what makes these a good choice for AI development.

Go

Go has been receiving a lot of attention recently. There might not be many AI projects written in Go just yet, but the language is on a path of continuous growth. (As an aside, AlphaGo, the first computer program to defeat the human world champion at the board game Go, shares a name with the language but is not evidence about it; the program is named after the game.)

Pros:
You can make use of Go's growing set of native machine learning libraries without having to call out to external ones.
It doesn't have classes; it only has packages, which makes the code cleaner and clearer.
It doesn't support inheritance, which makes code in Go easy to modify.

Cons:
There aren't many solid libraries for core AI development tasks.

With Go, it is possible to pull off core ML and some reinforcement learning tasks as well, despite the lack of libraries. Given Go's other versatile features, the future looks bright for this language as it finds more applications in AI development.

Lisp

Lisp is one of the oldest languages for AI development and as such gets an honorary mention. It is a very popular language in AI academic research and is equally effective in the AI development process. However, it is not such a usual choice among developers of recent times. Also, most modern libraries in machine learning, deep learning, and AI are written in popular languages such as C++, Python, and so on. But I wouldn't write off Lisp yet. It still has an immense capacity to build some really innovative AI projects, if you take the time to learn it.

Pros:
Its flexible and extendable nature enables fast prototyping, giving developers the freedom to quickly test out ideas and theories.
Since it was custom built for AI, its symbolic information processing capability is above par.
It is suitable for machine learning and inductive learning based projects.
Recompilation of functions alongside the running program is possible, which saves time.
Cons:
Since it is an old language, not a lot of developers are well-versed in it. New software and hardware also have to be configured to accommodate Lisp.

Given the vintage nature of Lisp in the AI world, it is quite interesting to see how things work in Lisp for AI development. The most famous example of a Lisp-based AI project is DART (Dynamic Analysis and Replanning Tool), used by the U.S. military.

Prolog

Finally, we have Prolog, which is another old language primarily associated with AI development and symbolic computation.

Pros:
It is a declarative language where everything is dictated by rules and facts.
It supports mechanisms such as tree-based data structuring, automatic backtracking, nondeterminism, and pattern matching, which are helpful for AI development. This makes it quite a powerful language for AI.
Its varied features are quite helpful in creating AI projects for different fields such as medicine, voice control, and networking.
It is flexible in nature and is used extensively for theorem proving, natural language processing, non-numerical programming, and AI in general.

Cons:
There is a high level of difficulty when it comes to learning Prolog compared to other languages.

Apart from the above-mentioned features, an implementation of symbolic computation in other languages can take up to tens of pages of indigestible code, while the same algorithms implemented in Prolog result in a clear and concise program that easily fits on one page.

So those are the top programming languages for Artificial Intelligence development. Choosing the right language eventually depends on the nature of your project. If you want an easy-to-learn language, go for Python; if you are working on a project where speed and performance are most critical, pick C++. If you are a creature of habit, Java is a good choice. If you are a thrill seeker who wants to learn a new and different language, choose Scala, R, or Go, and if you are feeling particularly adventurous, explore the quaint old worlds of Lisp or Prolog.

Why is Python so good for AI and Machine Learning? 5 Python Experts Explain
Top 6 Java Machine Learning/Deep Learning frameworks you can't miss
15 Useful Python Libraries to make your Data Science tasks Easier

5 ways Machine Learning is transforming digital marketing

Amey Varangaonkar
04 Jun 2018
7 min read
The enterprise interest in Artificial Intelligence is surging. In an era of cut-throat competition where it's either do or die, businesses have realized the transformative value of AI in gaining an upper hand over their rivals. Given its direct contribution to business revenue, it comes as no surprise that marketing has become one of the major application areas of machine learning. Per Capgemini, 84% of marketing organizations are implementing Artificial Intelligence in some capacity in 2018, and 3 out of 4 organizations implementing AI techniques have managed to increase the sales of their products and services by 10% or more. In this article, we look at 5 innovative ways in which machine learning is being used to enhance digital marketing.

Efficient lead generation and customer acquisition

One of the major keys to driving business revenue is getting more customers on board who will buy your products or services repeatedly. Machine learning comes in handy for identifying potential leads and converting those leads into customers. With the help of pattern recognition techniques, it is possible to understand a particular lead's behavioral and purchase trends. Through predictive analytics, it is then possible to predict whether a particular lead will buy the product or not. That lead is then put into the marketing sales funnel for targeted marketing campaigns, which may ultimately result in a purchase.

A cautionary note here - with GDPR (General Data Protection Regulation) in place across the EU (European Union), there are restrictions on the manner in which AI algorithms can be used to make automated decisions based on consumer data. This makes it imperative for businesses to strictly follow the regulation and operate under its purview, or they could face heavy penalties. As long as businesses respect privacy and follow basic human decency, such as asking for permission to use a person's data or informing them about how their data will be used, marketers can reap the benefits of data-driven marketing like never before. It all boils down to applying common sense while handling personal data, as one GDPR expert put it. But we all know how uncommon that sense is!

Customer churn prediction is now possible

'Customer churn rate' is a popular marketing term referring to the number of customers who opt out of a particular service offered by the company over a given time period. The churn time is calculated based on the customer's last interaction with the service or the website. It is crucial to track the churn rate as it is a clear indicator of the progress - or the lack of it - that a business is making. Predicting the customer churn rate is difficult - especially for e-commerce businesses selling a product - but it is not impossible, thanks to machine learning. By understanding historical data and the user's past website usage patterns, these techniques can help a business identify the customers who are most likely to churn out soon, and when that is expected to happen. Appropriate measures can then be taken to retain such customers - by giving special offers and discounts, timely follow-up emails, and so on - without any human intervention. American entertainment giant Netflix makes perfect use of churn prediction to keep its churn rate at just 9%, lower than any of the other subscription streaming services out there today. Not just that, they also manage to market their services to drive more customer subscriptions.
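A churn predictor of the kind described above often starts life as a simple classifier over usage features. Here is a minimal, hypothetical Python sketch of that idea using scikit-learn; the data is synthetic and the feature names and churn rule are invented for illustration:

```python
# Hypothetical churn-prediction sketch on synthetic data.
# Features: days since last visit, sessions last month, support tickets.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.integers(0, 60, n),   # days_since_last_visit
    rng.integers(0, 30, n),   # sessions_last_month
    rng.integers(0, 5, n),    # support_tickets
])
# Toy label: customers absent for a while with few sessions tend to churn.
y = ((X[:, 0] > 30) & (X[:, 1] < 5)).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
new_customer = [[45, 2, 1]]  # illustrative feature values
print(f"churn probability: {model.predict_proba(new_customer)[0, 1]:.2f}")
```

A real system would use genuine interaction logs and validate the model properly, but the shape of the pipeline - features in, churn probability out, retention action triggered above a threshold - is the same.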
Dynamic pricing made easy

In today's competitive world, products need to be priced optimally. It has become imperative that companies define extremely competitive and relevant pricing for their products, or else customers might not buy them. On top of this, there are fluctuations in the demand and supply of the product, which can affect the product's pricing strategy. With the use of machine learning algorithms, it is now possible to forecast price elasticity by considering various factors, such as the channel on which the product is sold. Other factors taken into consideration could be the sales period, the product's positioning strategy, or customer demand. For example, e-commerce giants Amazon and eBay tweak their product prices on a daily basis. Their pricing algorithms take into account factors such as the product's popularity among customers, the maximum discount that can be offered, and how often the customer has purchased from the website. This strategy of dynamic pricing is now being adopted by almost all the big retail companies, even in their physical stores. There is specialized software available that leverages machine learning techniques to set dynamic prices for products. Competera is one such pricing platform, which transforms retail through ongoing, timely, and error-free pricing for category revenue growth and improvements in customer loyalty tiers. To know more about how dynamic pricing actually works, check out this Competitoor article.

Customer segmentation and radical personalization

Every individual is different and has unique preferences, likes, and dislikes. With machine learning, marketers can segment users into different buyer groups based on a variety of factors such as their product preferences, social media activities, their Google search history, and much more. For instance, there are machine learning techniques that can segment users based on who loves to blog about food, or loves to travel, or even which show they are most likely to watch on Netflix! The website can then recommend or market products to these customers accordingly. Affinio is one such platform used for segmenting customers based on their interests.

Content and campaign personalization is another widely recognized use case of machine learning for marketing. Machine learning algorithms are used to build recommendation systems that take into consideration the user's online behavior and website usage to analyze and recommend products that he/she is likely to buy. A prime example of this is Google's remarketing strategy, which tries to reconnect with customers who leave the website without buying anything by showing them relevant ads across different devices. The best part about recommendation systems is that they can recommend two completely different products to two customers with different usage patterns. Incorporating them within a website has turned out to be a valuable strategy for increasing customer loyalty and overall lifetime value.
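Recommendation systems like the ones just mentioned are often built on item-to-item similarity. The sketch below uses a tiny, hypothetical user-by-product ratings matrix and cosine similarity to pick a "customers also bought" candidate; it is illustrative only, not any vendor's actual algorithm:

```python
# Toy item-to-item recommender using cosine similarity.
# Rows are users, columns are products; ratings are hypothetical.
import numpy as np

ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 0, 0],
    [0, 1, 5, 4],
    [1, 0, 4, 5],
], dtype=float)

def cosine_sim(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def most_similar(item, ratings):
    cols = ratings.T                          # one column vector per product
    scores = [cosine_sim(cols[item], c) for c in cols]
    scores[item] = -1.0                       # never recommend the item itself
    return int(np.argmax(scores))

print(most_similar(0, ratings))  # -> 1: product co-rated most like product 0
```

Production recommenders add matrix factorization, implicit feedback, and freshness constraints on top, but co-rating similarity is the intuition underneath.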
Improving customer experience

Gone are the days when a customer who visited a website had to use the 'Contact Me' form for any query and wait for an executive to get back with the answer. These days, chatbots are integrated into almost every e-commerce website to answer ad hoc customer queries, and even to suggest products that fit the customer's criteria. There are live-chat features included in these chatbots as well, which allow customers to interact with the chatbot and understand the product features before they buy. For example, IBM Watson has a really cool feature called the Tone Analyzer. It parses the feedback given by the customer and identifies the tone of the feedback - whether it's angry, resentful, disappointed, or happy. It is then possible to take appropriate measures to ensure that a disgruntled customer is satisfied, or to appreciate the customer's positive feedback - whatever may be the case.

Marketing will only get better with machine learning

Highly accurate machine learning algorithms, better processing capabilities, and cloud-based solutions are now making it possible for companies to get the most out of AI for their marketing needs. Many companies have already adopted machine learning to boost their marketing strategy, with major players such as Google and Facebook leading the way. It is safe to say many more companies - especially small and medium-sized businesses - are expected to follow suit in the near future.

Read more
How machine learning as a service is transforming cloud
Microsoft Open Sources ML.NET, a cross-platform machine learning framework
Active Learning: An approach to training machine learning models efficiently

What RESTful APIs can do for Cloud, IoT, social media and other emerging technologies

Pavan Ramchandani
01 Jun 2018
13 min read
Two decades ago, the IT industry saw tremendous opportunities with the dot-com boom. Similar to the dot-com bubble, the IT industry is transitioning through another period of innovation. The disruption is seen in major lines of business with the introduction of recent technology trends like cloud services, the Internet of Things (IoT), single-page applications, and social media. In this article, we cover the role and implications of RESTful web APIs in these emerging technologies. This article is an excerpt from a book written by Balachandar Bogunuva Mohanram, titled RESTful Java Web Services, Second Edition.

Cloud services

We are in an era where business and IT are married together. For enterprises, a business model without an IT strategy is becoming out of the question. While keeping the focus on the core business, the challenge that often lies ahead of the executive team is optimizing the IT budget. Cloud computing has come to the rescue of the executive team by bringing savings to the IT spending incurred in running a business. Cloud computing is an IT model for enabling anytime, anywhere, convenient, on-demand network access to a shared pool of configurable computing resources. In simple terms, cloud computing refers to the delivery of hosted services over the internet that can be quickly provisioned and decommissioned with minimal management effort and little intervention from the service provider.

Cloud characteristics

Five key characteristics deemed essential for cloud computing are as follows:

On-demand self-service: the ability to automatically provision cloud-based IT resources as and when required by the cloud service consumer.
Broad network access: the ability to support seamless network access to cloud-based IT resources via different network elements such as devices, network protocols, security layers, and so on.
Resource pooling: the ability to share IT resources among cloud service consumers using the multi-tenant model.
Rapid elasticity: the ability to dynamically scale IT resources at runtime and also release IT resources based on demand.
Measured service: the ability to meter service usage and ensure that cloud service consumers are charged only for the services they utilize.

Cloud offering models

Cloud offerings can be broadly grouped into three major categories, IaaS, PaaS, and SaaS, based on where they sit in the technology stack:

Software as a Service (SaaS) delivers the applications required by an enterprise, saving the costs the enterprise would otherwise incur to procure, install, and maintain these applications, which are now offered by a cloud service provider at competitive pricing.
Platform as a Service (PaaS) delivers the platforms required by an enterprise for building its applications, saving the cost the enterprise would need to set up and maintain these platforms, which are now offered by a cloud service provider at competitive pricing.
Infrastructure as a Service (IaaS) delivers the infrastructure required by an enterprise for running its platforms or applications, saving the cost the enterprise would need to set up and maintain the infrastructure components, which are now offered by a cloud service provider at competitive pricing.

RESTful APIs' role in cloud services

RESTful APIs can be looked upon as the glue that connects cloud service providers and cloud service consumers. For example, application developers who need to display a weather forecast can consume the Google Weather API.
In this section, we will look at the applicability of RESTful APIs for provisioning resources in the cloud. For illustration, we will be using the Oracle Cloud service platform. Users can set up a free trial account via https://cloud.oracle.com/home and try out the examples discussed in the following sections. We will try to set up a test virtual machine instance using the REST APIs. The high-level steps are as follows:

1. Locate the REST API endpoint
2. Generate an authentication cookie
3. Provision the virtual machine instance

Locating the REST API endpoint

Once users have signed up for an Oracle Cloud account, they can locate the REST API endpoint by navigating as follows:

Login screen: choose the relevant cloud account details and click the My Services button.
Home page: displays the cloud services dashboard for the user. Click the Dashboard icon.
Dashboard screen: lists the various cloud offerings. Click the Compute Classic offering.
Compute Classic screen: displays the details of infrastructure resources utilized by the user.
Site Selector screen: displays the REST endpoint.

Generating an authentication cookie

Authentication is required for provisioning IT resources. For this purpose, we will generate an authentication cookie using the Authenticate User REST API. The details of the API are as follows:

API function: authenticate the supplied user credentials and generate an authentication cookie for use in subsequent API calls.
Endpoint: <REST endpoint captured in the previous section>/authenticate/ (for example, https://compute.eucom-north-1.oraclecloud.com/authenticate/)
HTTP method: POST
Request header properties: Content-Type: application/oracle-compute-v3+json and Accept: application/oracle-compute-v3+json
Request body: user (two-part name of the user in the format /Compute-identity_domain/user) and password (password for the specified user). Sample request:
{ "password": "xxxxx", "user": "/Compute-586113456/test@gmail.com" }
Response header properties: set-cookie (the authentication cookie value)

The authentication cookie can be generated by invoking the Authenticate User REST API via the Postman tool.
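For readers who prefer scripting over Postman, the same call can be made with Python's requests library. This is a hedged sketch: the endpoint, identity domain, and credentials are the placeholder values from the text above, not working ones:

```python
# Sketch: obtain an Oracle Compute authentication cookie.
# Endpoint and credentials are the placeholders from the example above.
import requests

ENDPOINT = "https://compute.eucom-north-1.oraclecloud.com"
HEADERS = {
    "Content-Type": "application/oracle-compute-v3+json",
    "Accept": "application/oracle-compute-v3+json",
}
body = {"user": "/Compute-586113456/test@gmail.com", "password": "xxxxx"}

resp = requests.post(f"{ENDPOINT}/authenticate/", json=body, headers=HEADERS)
resp.raise_for_status()
auth_cookie = resp.headers["set-cookie"]  # reuse in subsequent API calls
print("authenticated, cookie acquired")
```

The set-cookie header captured here plays the same role as the cookie Postman stores automatically.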
Provisioning a virtual machine instance

Consumers can provision IT resources on the Oracle Compute Cloud infrastructure service using the LaunchPlans or Orchestration REST API. For this demonstration, we will use the LaunchPlans REST API. The details of the API are as follows:

API function: launch plan used to provision infrastructure resources in the Oracle Compute Cloud Service.
Endpoint: <REST endpoint captured in the previous section>/launchplan/ (for example, https://compute.eucom-north-1.oraclecloud.com/launchplan/)
HTTP method: POST
Request header properties: Content-Type: application/oracle-compute-v3+json, Accept: application/oracle-compute-v3+json, and Cookie: <authentication cookie>
Request body: instances (array of instances to be provisioned; for details of the properties required by each instance, refer to http://docs.oracle.com/en/Cloud/iaas/compute-iaas-Cloud/stcsa/op-launchplan--post.html) and relationships (any relationships with other instances). Sample request:
{ "instances": [ { "shape": "oc3", "imagelist": "/oracle/public/oel_6.4_2GB_v1", "name": "/Compute-586113742/test@gmail.com/test-vm-1", "label": "test-vm-1", "sshkeys": [] } ] }
Response body: the provisioned list of instances and their relationships

A test virtual machine instance can be created by invoking the LaunchPlan REST API via the Postman tool. An HTTP response status of 201 confirms that the provisioning request was successful. The status of the provisioned instance can then be checked via the cloud service instances page.
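Continuing the scripted approach, here is a hedged sketch of the same LaunchPlans call in Python; the instance properties mirror the sample request above and the cookie value is a placeholder:

```python
# Sketch: provision a VM via the LaunchPlans API. Values mirror the
# sample request above; the cookie value is a placeholder.
import requests

ENDPOINT = "https://compute.eucom-north-1.oraclecloud.com"
auth_cookie = "nimbula=..."  # value returned by the authentication call
headers = {
    "Content-Type": "application/oracle-compute-v3+json",
    "Accept": "application/oracle-compute-v3+json",
    "Cookie": auth_cookie,
}
launch_plan = {
    "instances": [{
        "shape": "oc3",
        "imagelist": "/oracle/public/oel_6.4_2GB_v1",
        "name": "/Compute-586113742/test@gmail.com/test-vm-1",
        "label": "test-vm-1",
        "sshkeys": [],
    }]
}

resp = requests.post(f"{ENDPOINT}/launchplan/", json=launch_plan, headers=headers)
print(resp.status_code)  # 201 indicates the provisioning request succeeded
```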
Internet of Things

The Internet of Things (IoT), as the name says, can be considered a technology enabler for things (including people) to connect to or disconnect from the internet. The term IoT was first coined by Kevin Ashton in 1999. With broadband Wi-Fi becoming widely available, it is becoming a lot easier to connect things to the internet. This has a lot of potential to enable a smart way of living, and there are already many projects addressing smart homes, smart cities, and so on. A simple use case is predicting the arrival time of a bus so that commuters can learn about any delays and plan accordingly. In many developing countries, the transport system is equipped with smart devices that help commuters predict the arrival or departure time of a bus or train precisely. The analyst firm Gartner has predicted that more than 26 billion devices will be connected to the internet by 2020.

IoT platform

The IoT platform consists of four functional layers: the device, data, integration, and service layers. For each functional layer, let us understand the capabilities required of the IoT platform:

Device: device management capabilities supporting device registration, provisioning, and controlling access to devices, plus seamless connectivity to devices to send and receive data.
Data: management of the huge volume of data transmitted between devices; deriving intelligence from the data collected and triggering actions.
Integration: collaboration of information between devices.
Service: API gateways exposing the APIs.

IoT benefits

The IoT platform is seen as the latest evolution of the internet, offering various benefits. The IoT is becoming widely used due to the lowering cost of technologies such as cheap sensors, cheap hardware, and low-cost, high-bandwidth networks. The connected human is the most visible outcome of the IoT revolution. People are connected to the IoT through various means such as Wearables, Hearables, and Nearables, which can be used to improve the lifestyle, health, and wellbeing of human beings:

Wearables: any form of sophisticated, computer-like technology that can be worn or carried by a person, such as smart watches, fitness devices, and so on.
Hearables: wireless computing earpieces, such as headphones.
Nearables: smart objects with computing devices attached to them, such as door locks, car locks, and so on. Unlike Wearables or Hearables, Nearables are static.

In the healthcare industry, IoT-enabled devices can be used to monitor patients' heart rate or diabetes. Smart pills and nanobots could eventually replace surgery and reduce the risk of complications.

RESTful APIs' role in the IoT

The architectural pattern used for the realization of the majority of IoT use cases is the event-driven architecture pattern. The event-driven architecture software pattern deals with the creation, consumption, and identification of events. An event can be generalized to mean a change in the state of an entity. For example, a printer device connected to the internet may emit an event when the printer cartridge is low on ink, so that the user can order a new cartridge. The common capability required of devices connected to the internet is the ability to send and receive event data, and this can be easily accomplished with RESTful APIs. The following are some of the IoT APIs available on the market:

Hayo API: the Hayo API is used by developers to build virtual remote controls for IoT devices in a home. The API senses and transmits events between virtual remote controls and devices, making it easier for users to achieve desired actions on applications by simply manipulating a virtual remote control.
Mozilla Battery Status API: the Mozilla Battery Status API is used to monitor the system battery levels of mobile devices and streams notification events for changes in battery levels and charging progress. Its integration allows users to retrieve real-time updates of device battery levels and status.
Caret API: the Caret API allows status sharing across devices. The status can be customized as well.
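As a minimal illustration of a device emitting an event over REST, here is a sketch of the printer example above; the endpoint and payload schema are entirely hypothetical, and no real service is assumed:

```python
# Hypothetical example: a printer posts a "cartridge low" event
# to a made-up REST endpoint. Endpoint and fields are invented.
import requests

event = {
    "device_id": "printer-42",
    "event_type": "cartridge_low",
    "ink_level_percent": 7,
}

resp = requests.post("https://example.com/api/events", json=event, timeout=5)
print(resp.status_code)
```

Whatever IoT platform sits on the receiving end, the pattern is the same: a small JSON event body, an HTTP POST, and a consumer that reacts to the state change.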
Modern web applications

Web-based applications have seen a drastic evolution from Web 1.0 to Web 2.0. Web 1.0 sites were designed mostly with static pages; Web 2.0 added more dynamism. Let us take a quick snapshot of the evolution of web technologies over the years:

1993-1995: static HTML websites with embedded images and minimal JavaScript
1995-2000: dynamic web pages driven by JSP and ASP, CSS for styling, JavaScript for client-side validation
2000-2008: content management systems like WordPress, Joomla, Drupal, and so on
2009-2013: rich internet applications, portals, animations, Ajax, mobile web applications
2014 onwards: single-page apps, mashups, the social web

Single-page applications

Single-page applications are web applications designed to load the application in a single HTML page. Unlike traditional web applications, rather than refreshing the whole page to display new content, they enhance the user experience by dynamically updating the current page, similar to a desktop application. The following are some of the key features and benefits of single-page applications:

Load contents in a single page
No refresh of the page
Responsive design
Better user experience
Capability to fetch data asynchronously using Ajax
Capability for dynamic data binding

RESTful API role in single-page applications

In a traditional web application, the client requests a URI and the requested page is displayed in the browser. Subsequently, when the user submits a form, the submitted form data is sent to the server and the response is displayed by reloading the whole page. A single-page application, by contrast, exchanges only data with the server, typically as JSON over asynchronous calls to RESTful APIs, and updates the affected parts of the page in place.

Social media

Social media is the future of communication; it not only lets people interact but also enables the transfer of different content formats such as audio, video, and images between users. In Web 2.0 terms, social media is a channel that interacts with you, along with providing information. While regular media is one-way communication, social media is a two-way communication that asks for one's comments and lets one vote. Social media has seen tremendous usage via networking sites such as Facebook, LinkedIn, and so on.

Social media platforms

Social media platforms are based on Web 2.0 technology, which serves as an interactive medium for collaboration, communication, and sharing among users. We can classify social media platforms broadly based on their usage as follows:

Social networking services: platforms where people manage their social circles and interact with each other, such as Facebook.
Social bookmarking services: allow one to save, organize, and manage links to various resources over the internet, such as StumbleUpon.
Social media news: platforms that allow people to post news or articles, such as reddit.
Blogging services: platforms where users can exchange comments and views, such as Twitter.
Document sharing services: platforms that let you share your documents, such as SlideShare.
Media sharing services: platforms that let you share media content, such as YouTube.
Crowdsourcing services: obtaining needed services, ideas, or content by soliciting contributions from a large group of people or an online community, such as Ushahidi.

Social media benefits

User engagement through social media has seen tremendous growth, and many companies use social media channels for campaigns and branding. Let us look at the various benefits social media offers:

Customer relationship management: a company can use social media to promote its brand and can benefit from positive customer reviews.
Customer retention and expansion: customer reviews can become a valuable source of information for retention and can also help to add new customers.
Market research: social media conversations can become useful input for market research and planning.
Gain competitive advantage: the ability to see competitors' messages enables a company to build strategies to handle its peers in the market.
Public relations: corporate news can be conveyed to the audience in real time.
Cost control: compared to traditional methods of campaigning, social media offers better advertising at a cheaper cost.

RESTful API role in social media

Many social networks provide RESTful APIs to expose their capabilities. Let us look at some of the RESTful APIs of popular social media services:

YouTube: add YouTube features to your application, including the ability to upload videos, create and manage playlists, and more. Reference: https://developers.google.com/youtube/v3/
Facebook: the Graph API is the primary way to get data out of, and put data into, Facebook's platform. It's a low-level HTTP-based API that you can use to programmatically query data, post new stories, manage ads, upload photos, and perform a variety of other tasks that an app might implement. Reference: https://developers.facebook.com/docs/graph-api/overview
Twitter: Twitter provides APIs to search, filter, and create an ads campaign. Reference: https://developer.twitter.com/en/docs

To summarize, we discussed modern technology trends and the role of RESTful APIs in each of these areas, including their implications for the cloud, virtual machines, user experience for various architectures, and building social media applications. To know more about designing and working with RESTful web services, do check out RESTful Java Web Services, Second Edition.
Getting started with Django and Django REST frameworks to build a RESTful app
How to develop RESTful web services in Spring

Best practices for deploying self-service BI with Qlik Sense

Amey Varangaonkar
31 May 2018
7 min read
As part of a successful deployment of Qlik Sense, it is important that IT recognizes that self-service Business Intelligence has its own dynamics and adoption rules. The various use cases and subsequent user groups need to be assessed and captured. Governance should always be present, but power users should never get the feeling that they are restricted. Once they are won over, the rest of the traction and the adoption by other user types comes easily. In this article, we will look at the most important points to keep in mind while deploying self-service with Qlik Sense. The following excerpt is taken from the book Mastering Qlik Sense, authored by Martin Mahler and Juan Ignacio Vitantonio. This book demonstrates useful techniques to design useful and highly profitable Business Intelligence solutions using Qlik Sense. Here's the list of points to be kept in mind:

Qlik Sense is not QlikView

Not even nearly. The biggest challenge and fallacy is that the organization was sold, by Qlik or someone else, just the next version of the tool. It did not help at all that Qlik itself worked for years on Qlik Sense under the initial product name Qlik.Next. However it is being sold to you, Qlik Sense is at best the cousin of QlikView. Same family, but no blood relation. Thinking otherwise sets the wrong expectation: the business gives the wrong message to stakeholders, and IT is not made aware that self-service BI cannot be deployed in the same fashion as guided analytics (QlikView, in this case). Disappointment is imminent when stakeholders realize Qlik Sense cannot replicate their QlikView dashboards.

Simply installing Qlik Sense does not create a self-service BI environment

Installing Qlik Sense and giving users access to the tool is a start, but there is more to it than simply installing it. The infrastructure requires design and planning, data quality processing, data collection, and determining who intends to use the platform to consume what type of data. If data is not available and accessible to the user, data analytics serves no purpose. Make sure a data warehouse or similar is in place and the business has a use case for self-service data analytics. A good indicator for this is when the business or project works with a lot of data, and there are business users who have lots of Excel spreadsheets lying around, analyzing it in different ways. That's your best-case candidate for Qlik Sense.

IT should monitor the Qlik Sense environment rather than control it

IT needs to unlearn in order to learn new things, and the same applies when it comes to deploying self-service. Create a framework with guidelines and principles, and monitor that users are following it rather than limiting their capabilities. This framework needs the input of the users as well, and it needs to be elastic. Admittedly, not many IT professionals agree with giving away so much power to the user in the development process, believing it leads to chaos and anarchy. While the risk is there, this fear needs to be overcome. Users love data analytics, and they are keen to get the help of IT to create the most valuable dashboards possible and ensure they are well received by a wide audience.

Identifying key users and user groups is crucial

For a strong adoption of the tool, IT needs to prepare the environment, identify the key power users in the organization, and win them over to using the technology.
It is important that they are intensively supported, especially in the beginning, and that they are allowed to drive how the technology should be used rather than having principles imposed on them. Governance should always be present, but power users should never get the feeling they are restricted by it. Because once they are won over, the rest of the traction and the adoption by other user types is very easy.

Qlik Sense sells well - do a lot of demos

Data analytics, compelling visualizations, and the interactivity of Qlik Sense are something almost everyone is interested in. The business wants to see its own data aggregated and distilled in a cool and glossy dashboard. Utilize the momentum and do as many demos as you can to win advocates for the technology and promote the consciousness of becoming a data-driven culture in the organization. Even the simplest Qlik Sense dashboards amaze people and boost their creativity about use cases where data analytics in their area could apply and create value.

Promote collaboration

Sharing is caring. This not only applies to insights, which naturally are shared with the excitement of having found out something new and valuable, but also to how the new insight was derived. People keep their secrets on approach and methodology to themselves, but this is counterproductive. It is important that applications, visualizations, and dashboards created with Qlik Sense are shared and demonstrated to other Qlik Sense users as frequently as possible. This not only promotes a data-driven culture but also encourages collaboration between users and teams across various business functions which would not have happened otherwise. They could be sharing knowledge, tips, and tricks, or even realizing they look at the same slices of data and could create additional value by connecting them together.

Market the success of Qlik Sense within the organization

If Qlik Sense has had a successful achievement in a project, tell others about it. Create a success story and propose doing demos of the dashboard and its analytics. IT has historically been very bad at promoting its work, which is counterproductive. Data analytics creates value, and there is nothing embarrassing about boasting about its success; as Muhammad Ali suggested, it's not bragging if it's true.

Introduce guidelines on design and terminology

Avoid the pitfall of having multiple different-looking dashboards by promoting a consistent branding look across all Qlik Sense dashboards and applications, including terminology and best practices. Ensure the guidelines document is easily accessible to all users. Also, create predesigned templates with some sample sheets so that users can duplicate them, modify them to their liking, and extend them while applying the same design.

Protect less experienced users from complexities

Don't overwhelm users if they have never developed anything in their life. Approach less technically savvy users in a different way by providing them with sample data and sample templates, including a library of predefined visualizations, dimensions, and measures (so-called master items). Be aware that what is intuitive to Qlik professionals or power users is not necessarily intuitive to other users - be patient and appreciative of their feedback, and try to understand how a typical business user might think.
If you found the excerpt useful, make sure you check out the book Mastering Qlik Sense to learn more of these techniques for efficient Business Intelligence using Qlik Sense.

Read more
How Qlik Sense is driving self-service Business Intelligence
Overview of a Qlik Sense® Application's Life Cycle
What we learned from Qlik Qonnections 2018

Why Enterprises love the Elastic Stack

Pravin Dhandre
31 May 2018
2 min read
Business insight has always been a hot pursuit for companies, and with data that keeps flowing, growing, and getting fatter by the day, analytics needs to be quicker, real-time, and reliable. Analytics that can't keep up with today's data provides insights that are almost lifeless against market dynamics. The question, then, is: is there an analytics solution that can tackle the data hydra? The Elastic Stack is your answer. It is power-packed with tools like Elasticsearch, Kibana, Logstash, X-Pack, and Beats that take data from any source, in any format, and provide instant search, analysis, and visualization in real time. With over 225 million downloads, it is a clear crowd favorite. Enterprises get an added benefit whether they use it as a single analytical suite or integrate it with other products, delivering real-time actionable insights and decisions every time.

Why enterprises love the Elastic Stack

One of the common things that enterprises love about the Elastic Stack is that it is an open source platform. The next thing IT companies enjoy is its super-fast distributed search mechanism that makes queries run faster and more efficiently. Apart from this, its bundling with Kibana and Logstash makes it great for IT infrastructure and DevOps teams, who can aggregate and analyze billions of logs with ease. Its simple and robust analysis platform provides a distinct advantage over Splunk, Solr, Sphinx, Ambar, and many other alternative product suites. Also, its SaaS option allows customers to perform log analytics, full-text search, and application monitoring in the cloud with utmost ease and reasonable pricing.

Companies like Amazon, Bloomberg, eBay, SAP, Citibank, Sony, Mozilla, WordPress, and Salesforce have already been using the Elastic Stack, powering their search and analytics to combat their daily business challenges. Whether it is an educational institution, a travel agency, an e-commerce company, or a financial institution, the Elastic Stack is empowering millions of companies with real-time metrics, strong analytics, a better search experience, and high customer satisfaction.
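To make the "instant search" claim concrete, here is a minimal sketch against Elasticsearch's REST API; it assumes a local development node on localhost:9200 with security disabled, and the index name and document fields are illustrative:

```python
# Minimal Elasticsearch REST sketch: index one document, then search it.
# Assumes a local dev node at localhost:9200; index and fields are examples.
import requests

BASE = "http://localhost:9200"

# Index a document.
doc = {"title": "Elastic Stack", "body": "real-time search and analytics"}
requests.put(f"{BASE}/articles/_doc/1", json=doc).raise_for_status()

# Force a refresh so the document is immediately searchable.
requests.post(f"{BASE}/articles/_refresh").raise_for_status()

# Full-text search for it.
resp = requests.get(f"{BASE}/articles/_search", params={"q": "analytics"})
hits = resp.json()["hits"]["hits"]
print([h["_source"]["title"] for h in hits])  # ['Elastic Stack']
```

The same two HTTP calls, index and search, are what Logstash, Beats, and Kibana are issuing under the hood at much larger scale.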
How to install Elasticsearch in Ubuntu and Windows
How to perform Numeric Metric Aggregations with Elasticsearch
CRUD (Create Read, Update and Delete) Operations with Elasticsearch