
Tech News - Single Board Computers

22 Articles

Intel introduces cryogenic control chip, ‘Horse Ridge’ for commercially viable quantum computing

Fatema Patrawala
11 Dec 2019
4 min read
On Monday, Intel Labs introduced a first-of-its-kind cryogenic control chip, codenamed Horse Ridge. According to Intel, Horse Ridge will enable commercially viable quantum computers and speed up the development of full-stack quantum computing systems. Intel announced that Horse Ridge will enable control of multiple quantum bits (qubits) and set a clear path toward scaling to larger systems.

This looks like a major milestone on the path to quantum practicality. Right now, a central challenge for quantum computing is that it only works at near-freezing temperatures, and Intel is trying to change that with this control chip. As per Intel, Horse Ridge will enable control at very low temperatures by eliminating the hundreds of wires running into the refrigerated case that houses the quantum computer.

Horse Ridge was developed in partnership with Intel’s research collaborators at QuTech at Delft University of Technology and is fabricated using Intel’s 22-nanometer FinFET manufacturing technology. The in-house fabrication of these control chips will dramatically accelerate the company’s ability to design, test, and optimize a commercially viable quantum computer, the company said.

“A lot of research has gone into qubits, which can do simultaneous calculations. But Intel saw that controlling the qubits created another big challenge to developing large-scale commercial quantum systems,” states Jim Clarke, director of quantum hardware at Intel, in the official press release.

“It’s pretty unique in the community, as we’re going to take all these racks of electronics you see in a university lab and miniaturize that with our 22-nanometer technology and put it inside of a fridge,” added Clarke. “And so we’re starting to control our qubits very locally without having a lot of complex wires for cooling.”

The name “Horse Ridge” is inspired by one of the coldest regions in Oregon, known as Horse Ridge.
It is designed to operate at cryogenic temperatures of approximately 4 kelvin, which is about -452 degrees Fahrenheit, or -269 degrees Celsius.

What is the innovation behind Horse Ridge?

Quantum computers promise the potential to tackle problems that conventional computers can’t handle by themselves. They leverage a phenomenon of quantum physics that allows qubits to exist in multiple states simultaneously. As a result, qubits can conduct a large number of calculations at the same time, dramatically speeding up complex problem-solving.

But Intel acknowledges that the quantum research community still lags behind in demonstrating quantum practicality, a benchmark for determining whether a quantum system can deliver game-changing performance on real-world problems. To date, researchers have focused on building small-scale quantum systems to demonstrate the potential of quantum devices. In these efforts, they have relied on existing electronic tools and rack-scale, high-performance computing instruments to connect the quantum system to the traditional computational devices that regulate qubit performance and program the system inside the cryogenic refrigerator. These devices are often custom designed to control individual qubits, requiring hundreds of connective wires in and out of the refrigerator.

This extensive control cabling for each qubit hinders the ability to scale the quantum system to the hundreds or thousands of qubits required to demonstrate quantum practicality, not to mention the millions of qubits required for a commercially viable quantum solution. With Horse Ridge, Intel radically simplifies the control electronics required to operate a quantum system.
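As a rough back-of-envelope illustration (my own, not from the article) of why qubit counts matter so much: an n-qubit register is described by 2^n complex amplitudes, which is both why qubits can hold so much simultaneous state and why every extra qubit doubles the classical resources needed to fully describe the system.

```python
# Illustration (not from the article): an n-qubit register is described
# by 2**n complex amplitudes, so the classical memory needed to hold a
# full description of its state doubles with every qubit added.

def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory for a dense n-qubit state vector (complex128 amplitudes)."""
    return (2 ** n_qubits) * bytes_per_amplitude

print(state_vector_bytes(10))            # 16384 bytes -- trivial
print(state_vector_bytes(30) // 2**30)   # 16 (GiB) -- a workstation's RAM
print(state_vector_bytes(50) // 2**50)   # 16 (PiB) -- far beyond any machine
```

The doubling per qubit is the whole story: simulating or exhaustively characterizing even a few dozen qubits classically is already impractical.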
Replacing these bulky instruments with a highly integrated system-on-chip (SoC) will simplify system design and allow for sophisticated signal processing techniques that accelerate set-up time, improve qubit performance, and enable the system to scale efficiently to larger qubit counts.

“One option is to run the control electronics at room temperature and run coax cables down to configure the qubits. But you can immediately see that you’re going to run into a scaling problem because you get to hundreds or thousands of cables and it’s not going to work,” said Richard Uhlig, managing director of Intel Labs. “What we’ve done with Horse Ridge is that it’s able to run at temperatures that are much closer to the qubits themselves. It runs at about 4 degrees Kelvin. The innovation is that we solved the challenges around getting CMOS to run at those temperatures and still have a lot of flexibility in how the qubits are controlled and configured.”

To know more about this exciting news, check out the official announcement from Intel.

Are we entering the quantum computing era? Google’s Sycamore achieves ‘quantum supremacy’ while IBM refutes the claim
The US to invest over $1B in quantum computing, President Trump signs a law
Quantum computing, edge analytics, and meta learning: key trends in data science and big data in 2019


AMD competes with Intel by launching EPYC Rome, world’s first 7 nm chip for data centers, luring in Twitter and Google

Bhagyashree R
09 Aug 2019
5 min read
On Wednesday, Advanced Micro Devices (AMD) unveiled its highly anticipated second-generation EPYC processor chip for data centers, code-named “Rome”. Since the launch, the company has announced agreements with many tech giants, including Intel’s biggest customers, Twitter and Google.

Lisa Su, AMD’s president and CEO, said during her keynote at the launch event, “Today, we set a new standard for the modern data center. Adoption of our new leadership server processors is accelerating with multiple new enterprise, cloud and HPC customers choosing EPYC processors to meet their most demanding server computing needs.”

EPYC Rome: The world’s first 7 nm server chip

AMD first showcased the EPYC Rome chip, the world's first 7 nm server processor, at its Next Horizon 2018 event. Based on the Zen 2 microarchitecture, it features up to eight 7 nm-based chiplet processors with a 14 nm-based IO chip in the center, interconnected by an Infinity fabric. The chip aims to offer twice the performance per socket and about 4x the floating-point performance of the previous generation of EPYC chips.

https://www.youtube.com/watch?v=kC3ny3LBfi4

At the launch, a performance comparison based on the SPECrate 2017 int-peak benchmark showed the top-of-the-line 64-core AMD EPYC 7742 processor delivering double the performance of the top-of-the-line 28-core Intel Xeon Platinum 8280M. Priced at under $7,000, it is also far more affordable than Intel’s chip, priced at $13,000.

AMD competes with Intel, the dominant supplier of data center chips

AMD’s main competitor in the data center chip realm is Intel, the dominant supplier of data center chips with more than 90% of the market share. However, AMD was able to capture a small market share with the release of its first-generation EPYC server chips. Coming up with a second-generation chip that is performant yet affordable gives AMD an edge over Intel.
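A back-of-envelope calculation using only the figures quoted above (the claimed 2x SPECrate result and the two list prices; my own arithmetic, not AMD's) shows the rough performance-per-dollar edge AMD is pitching:

```python
# Back-of-envelope only, using the figures quoted in the article:
# ~2x the benchmark performance at roughly half the list price.

epyc_7742  = {"relative_perf": 2.0, "price_usd": 7_000}   # 64-core Rome
xeon_8280m = {"relative_perf": 1.0, "price_usd": 13_000}  # 28-core Xeon

def perf_per_kusd(chip):
    """Relative benchmark performance per $1,000 of list price."""
    return chip["relative_perf"] / (chip["price_usd"] / 1_000)

ratio = perf_per_kusd(epyc_7742) / perf_per_kusd(xeon_8280m)
print(f"~{ratio:.1f}x performance per dollar")  # ~3.7x
```

Taken at face value, the quoted numbers imply roughly a 3.7x price-performance advantage on that one benchmark, which is the core of AMD's pitch here.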
Donovan Norfolk, executive director of Lenovo’s data center group, told Data Center Knowledge, “Intel had a significant portion of the market for a long time. I think they’ll continue to have a significant portion of it. I do think that there are more customers that will look at AMD than have in the past.”

Intel falling behind schedule in launching its 10 nm chips may also have worked in AMD's favor. After a long wait, the 10 nm parts were officially launched earlier this month, and Intel's 7 nm chips are not expected to arrive until 2021.

The EPYC Rome chip has already grabbed the attention of many tech giants. Google is planning to use the EPYC server chip in its internal data centers and also wants to offer it to external developers as part of its cloud computing offerings. Twitter will start using EPYC servers in its data centers later this year. Hewlett Packard Enterprise is already using these chips in three ProLiant servers and plans to have 12 systems by the end of this year. Dell also plans to add second-gen EPYC servers to its portfolio this fall. Following AMD’s customer announcements, Intel shares were down 0.6% to $46.42 in after-hours trading.

Though AMD’s chips beat Intel’s in some computing tasks, they lag in a few desirable and advanced features. Patrick Moorhead, founder of Moor Insights & Strategy, told Reuters, “Intel chip features for machine learning tasks and new Intel memory technology being with customers such as German software firm SAP SE (SAPG.DE) could give Intel an advantage in those areas.”

This news sparked a discussion on Hacker News. A user said, “This is a big win for AMD and for me it reconfirms that their strategy of pushing into the mainstream features that Intel is trying to hold hostage for the "high end" is a good one.
Back when AMD first introduced the 64-bit extensions to the x86 architecture and directly challenged Intel who was selling 64 bits as a "high end" feature in their Itanium line, it was a place where Intel was unwilling to go (commoditizing 64-bit processors). That proved pretty successful for them. Now they have done it again by commoditizing "high core count" processors. Each time they do this I wonder if Intel will ever learn that you can't "get away" with selling something for a lot of money that can be made more cheaply forever. ”

Another user commented, “I hope AMD turns their attention to machine learning tasks soon not just against Intel but NVIDIA also. The new Titan RTX GPUs with their extra memory and Nvlink allow for some really awesome tricks to speed up training dramatically but they nerfed it by only selling without a blower-style fan making it useless for multi-GPU setups. So the only option is to get Titan RTX rebranded as a Quadro RTX 6000 with a blower-style fan for $2,000 markup. $2000 for a fan. The only way to stop things like this will be competition in the space.”

To know more in detail, you can watch the EPYC Rome launch event: https://www.youtube.com/watch?v=9Jn9NREaSvc

Intel’s 10th gen 10nm ‘Ice Lake’ processor offers AI apps, new graphics and best connectivity
Why Intel is betting on BFLOAT16 to be a game changer for deep learning training? Hint: Range trumps Precision.
Intel’s new brain inspired neuromorphic AI chip contains 8 million neurons, processes data 1K times faster


Intel’s 10th gen 10nm ‘Ice Lake’ processor offers AI apps, new graphics and best connectivity

Vincy Davis
02 Aug 2019
4 min read
After a long wait, Intel has officially launched its first 10th generation Core processors, code-named ‘Ice Lake’. The first batch contains 11 highly integrated 10 nm processors that showcase high-performance artificial intelligence (AI) features and are designed for sleek 2-in-1s and laptops.

The ‘Ice Lake’ processors are manufactured on Intel’s 10 nm process and pair the CPU with a 14 nm chipset in the same carrier. Each includes two or four Sunny Cove cores along with Intel’s Gen11 graphics processing unit (GPU). The 10 nm figure indicates the size of the transistors used; smaller transistors generally consume less power.

Read More: Intel unveils the first 3D Logic Chip packaging technology, ‘Foveros’, powering its new 10nm chips, ‘Sunny Cove’

Chris Walker, Intel corporate vice president and general manager of Mobility Client Platforms in the Client Computing Group, says that “With broad-scale AI for the first time on PCs, an all-new graphics architecture, best-in-class Wi-Fi 6 (Gig+) and Thunderbolt 3 – all integrated onto the SoC, thanks to Intel’s 10nm process technology and architecture design – we’re opening the door to an entirely new range of experiences and innovations for the laptop.”

Intel was originally supposed to ship 10 nm processors back in 2016. Intel CEO Bob Swan says the delay was due to the “company’s overly aggressive strategy for moving to its next node.” Intel has also introduced a new processor numbering structure for the 10th generation ‘Ice Lake’ processors, which indicates the generation and the level of graphics performance of each processor.

Image source: Intel

What’s new in the 10th generation Intel Core processors?

Intelligent performance

The 10th generation Core processors are the first purpose-built processors for AI on laptops and 2-in-1s.
They are built for modern AI-infused applications and include features such as:

Intel Deep Learning Boost, used specifically for boosting the flexibility to run complex AI workloads. It has a dedicated instruction set that accelerates neural networks on the CPU for maximum responsiveness.
Up to 1 teraflop of GPU engine compute for sustained high-throughput inference applications.
Intel’s Gaussian & Neural Accelerator (GNA), an exclusive engine for background workloads such as voice processing and noise prevention at ultra-low power, for maximum battery life.

New graphics

With Iris Plus graphics, the 10th generation Core processors deliver double the graphics performance in 1080p and higher-level content creation in 4K video editing, application of video filters, and high-resolution photo processing. This is the first time Intel’s GPU will support VESA’s Adaptive Sync* display standard, enabling a smoother gaming experience in games like Dirt Rally 2.0* and Fortnite*. According to Intel, this is the industry's first integrated GPU to incorporate variable rate shading for better rendering performance, as it uses the Gen11 graphics architecture. The 10th generation Core processors also support the BT.2020* specification, making it possible to view 4K HDR video in a billion colors.

Best connectivity

With improved board integration, PC manufacturers can innovate on form factor for sleeker designs with Wi-Fi 6 (Gig+) connectivity and up to four Thunderbolt 3 ports. Intel claims this is the “fastest and most versatile USB-C connector available.”

The first batch of 11 'Ice Lake' processors comprises six Ice Lake U-series and five Ice Lake Y-series processors; the complete list is given below.

Image Source: Intel

Intel has revealed that laptops with the 10th generation Core processors can be expected this holiday season.
The post also states that Intel will soon release additional products in the 10th generation Intel Core mobile processor family due to increased computing needs. The upcoming processors will “deliver increased productivity and performance scaling for demanding, multithreaded workloads.”

Users love the new 10th generation Core processor features and are especially excited about the Gen11 graphics.

https://twitter.com/Tribesigns/status/1133284822548279296
https://twitter.com/Isaacraft123/status/1156982456408596481

Many users are also expecting to see the new processors in the upcoming Mac notebooks.

https://twitter.com/ChernSchwinn1/status/1157297037336928256
https://twitter.com/matthewmspace/status/1157295582844575744

Head over to the Intel newsroom page for more details.

Apple advanced talks with Intel to buy its smartphone modem chip business for $1 billion, reports WSJ
Why Intel is betting on BFLOAT16 to be a game changer for deep learning training? Hint: Range trumps Precision.
Intel’s new brain inspired neuromorphic AI chip contains 8 million neurons, processes data 1K times faster


Alibaba’s chipmaker launches open source RISC-V based ‘XuanTie 910 processor’ for 5G, AI, IoT and self-driving applications

Vincy Davis
26 Jul 2019
4 min read
Launched in 2018, Alibaba’s chip subsidiary Pingtouge made a major announcement yesterday: it is launching its first product, the XuanTie 910 processor, built on the open-source RISC-V instruction set architecture. The XuanTie 910 is expected to reduce the costs of related chip production by more than 50%, reports Caixin Global.

XuanTie 910, also known as T-Head, will soon be available in the market for commercial use. Pingtouge will also release some of XuanTie 910’s code on GitHub for free to help the global developer community create innovative applications. No release dates have been revealed yet.

What are the properties of the XuanTie 910 processor?

The 16-core XuanTie 910 achieves 7.1 CoreMark/MHz, and its main frequency can reach 2.5 GHz. The processor can be used to build high-end edge-based microcontrollers (MCUs), CPUs, and systems-on-chip (SoCs), for applications like 5G telecommunication, artificial intelligence (AI), and autonomous driving. The XuanTie 910 delivers 40% higher performance than mainstream RISC-V cores, along with a 20% increase in terms of instructions.

According to Synced, XuanTie 910 has two unconventional properties:
It has a 2-stage pipelined, out-of-order, triple-issue processor with two memory accesses per cycle.
Its computing, storage, and multi-core capabilities are superior due to an extended instruction set: XuanTie 910 adds more than 50 instructions beyond standard RISC-V.

Last month, The Verge reported that an internal ARM memo had instructed staff to stop working with Huawei. With the US blacklisting China’s telecom giant Huawei and banning any American company from doing business with it, it seems that ARM is also following the American strategy. Although ARM is based in the U.K. and owned by Japan's SoftBank Group, it does have “US origin technology”, as claimed in the internal memo.
This may be one of the reasons why Alibaba is increasing its efforts in developing RISC-V, so that Chinese tech companies can become independent of Western technologies. A XuanTie 910 processor can assure Chinese companies of a stable future, with no fear of being banned by Western governments. Besides being cost-effective, RISC-V has other advantages, like more flexibility compared to ARM. With ARM's complex licence policies, it is going to be a challenge for ARM to compete against RISC-V and MIPS (Microprocessor without Interlocked Pipeline Stages) processors.

A Hacker News user comments, “I feel like we (USA) are forcing China on a path that will make them more competitive long term.”

Another user says, “China is going to be key here. It's not just a normal market - China may see this as essential to its ability to develop its technology. It's Made in China 2025 policy. That's taken on new urgency as the west has started cutting China off from western tech - so it may be normal companies wanting some insurance in case intel / arm cut them off (trade disputes etc) AND the govt itself wanting to product its industrial base from cutoff during trade disputes”

Some users also feel that it is technology that wins when two big economies keep bringing out innovative technologies. A comment on Hacker News reads, “Good to see development from any country. Obviously they have enough reason to do it. Just consider sanctions. They also have to protect their own market. Anyone that can afford it, should do it. Ultimately it is a good thing from technology perspective.”

Not all US tech companies are wary of partnering with Chinese counterparts. Two days ago, Salesforce, an American cloud-based software company, announced a strategic partnership with Alibaba, which aims to help Salesforce localize its products in mainland China, Hong Kong, Macau, and Taiwan.
This will enable Salesforce customers to market, sell, and operate through services like Alibaba Cloud and Tmall.

Winnti Malware: Chinese hacker group attacks major German corporations for years, German public media investigation reveals
The US Justice Department opens a broad antitrust review case against tech giants
Salesforce is buying Tableau in a $15.7 billion all-stock deal


Raspberry Pi 4 has a USB-C design flaw, some power cables don't work

Vincy Davis
10 Jul 2019
5 min read
Raspberry Pi 4 was released last month, with much hype and promotion. It has a 1.5GHz quad-core 64-bit ARM Cortex-A72 CPU, three memory options of up to 4GB, full-throughput gigabit Ethernet, and a USB-C port as a power connector. The USB-C power connector is a first for a Pi board.

However, four days after its release, Tyler Ward, an electronics and product engineer, disclosed that the new Pi 4 does not charge when used with an electronically marked, or e-marked, USB-C cable, the type used by Apple MacBooks and other laptops. Two days ago, Pi co-creator Eben Upton confirmed the issue: “A smart charger with an e-marked cable will incorrectly identify the Raspberry Pi 4 as an audio adapter accessory, and refuse to provide power.” Upton adds that Ward's technical breakdown of the underlying issue in the Pi 4's circuitry offers a detailed overview of why e-marked USB-C cables won't power the Pi.

According to Ward’s blog, “The root cause of the problem is the shared cc pull down resistor on the USB Type-C connector. By looking at the reduced pi schematics, we can see it as R79 which connects to both the CC lines in the connector.” “With most chargers this won’t be an issue as basic cables only use one CC line which is connected through the cable and as a result the pi will be detected correctly and receive power. The problem comes in with e-marked cables which use both CC connections”, he adds.

Ward has suggested some workarounds. First, he recommends using a non-e-marked cable, which most USB-C phone charger cables are likely to be, rather than an e-marked one. Older chargers with A-to-C cables or micro-B-to-C adaptors will also work if they provide enough power, as these don’t require CC detection. The complete fix would be for Raspberry Pi to add a second CC resistor in a future board revision.
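The failure mode Ward describes can be sketched in a toy model (my own illustration with approximate values, not Raspberry Pi firmware or the exact USB-C specification logic): a charger classifies each CC pin by the pull-down resistance it measures, and the Pi 4's single shared pull-down lets the e-marked cable's Ra marker resistance land in parallel with Rd, dragging both readings into the Ra range.

```python
# Toy model (illustrative only; approximate values, not Pi firmware).
# A USB-C source classifies each CC pin by its pull-down resistance:
# Ra (~1 kOhm) marks a cable's e-marker chip, Rd (~5.1 kOhm) marks a
# power sink; Ra seen on both pins signals an audio adapter accessory.

RD = 5_100.0  # sink pull-down, ohms
RA = 1_000.0  # e-marker / accessory pull-down, ohms

def parallel(*resistances):
    """Equivalent resistance of resistors in parallel, in ohms."""
    return 1.0 / sum(1.0 / r for r in resistances)

def classify(r_ohms):
    """Very rough bucketing of a measured CC pull-down."""
    if r_ohms < 2_000:
        return "Ra"
    if r_ohms < 10_000:
        return "Rd"
    return "open"

# Compliant sink, e-marked cable: the charger sees Rd on the CC wire
# and the marker's Ra on the other pin -> valid sink, power provided.
print(classify(RD), classify(RA))  # Rd Ra

# Pi 4: one shared pull-down (R79) ties both CC lines together, so the
# e-marker's Ra ends up in parallel with Rd and drags the reading down:
print(classify(parallel(RD, RA)))  # Ra  (~836 ohms)

# Ra-range readings on both pins look like the audio-adapter-accessory
# signature, for which a smart charger supplies no power -- matching
# Upton's description of the misidentification.
```

With a basic cable only one CC line is wired through, so the shared resistor is seen on one pin only and detection works, which is why non-e-marked cables power the board fine.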
Another option is to buy the $8/£8 official Raspberry Pi 4 power supply. In a statement to TechRepublic, Upton adds, “It's surprising this didn't show up in our (quite extensive) field testing program.”

Benson Leung, a Google Chrome OS engineer, has also criticized Raspberry Pi in a Medium blog post sarcastically titled “How to design a proper USB-C™ power sink (hint, not the way Raspberry Pi 4 did it)”. Leung identifies two critical mistakes on Raspberry Pi’s part. First, Raspberry Pi should have copied the figure from the USB-C spec exactly instead of designing a new circuit; Leung says Raspberry Pi “designed this circuit themselves, perhaps trying to do something clever with current level detection, but failing to do it right.” The second mistake is that they didn’t actually test the Pi 4 design with advanced cables. “The fact that no QA team inside of Raspberry Pi’s organization caught this bug indicates they only tested with one kind (the simplest) of USB-C cables.”, he adds.

Many users agreed with Leung and expressed their own views on the faulty USB-C design, finding it hard to believe that Raspberry Pi shipped these boards without trying a MacBook charger. A user on Hacker News comments, “I find it incredible that presumably no one tried using a MacBook charger before this shipped. If they did and didn't document the shortcoming that's arguably just as bad. Surely a not insignificant number of customers have MacBooks? If I was writing some test specs this use case would almost certainly feature, given the MacBook Pro's USB C adapter must be one of the most widespread high power USB C charger designs in existence. Especially when the stock device does not ship with a power supply, not like it was unforeseeable some customers would just use the chargers they already have.”

Some are glad that they have not ordered their Raspberry Pi 4 yet.
https://twitter.com/kb2ysi/status/1148631629088342017

However, some users believe it’s not that big a deal.

https://twitter.com/kb2ysi/status/1148635750210183175

A user on Hacker News comments, “Eh, it’s not too bad. I found a cable that works and I’ll stick to it. Even with previous-gen Pis there was always a bit of futzing with cables to find one that has small enough voltage drop to not get power warnings (even some otherwise “good” cables really cheap out on copper). The USB C thing is still an issue, and I’m glad it’ll be fixed, but it’s really not that big of a deal.”

Neither Upton nor Raspberry Pi has disclosed a schedule for the board revision so far.

10+ reasons to love Raspberry Pi
You can now install Windows 10 on a Raspberry Pi 3
Raspberry Pi opens its first offline store in England


The Linux Foundation announces the CHIPS Alliance project for deeper open source hardware integration

Sugandha Lahoti
12 Mar 2019
2 min read
In order to advance open source hardware, the Linux Foundation yesterday announced the new CHIPS Alliance project. Backed by Esperanto, Google, SiFive, and Western Digital, the CHIPS Alliance “will foster a collaborative environment that will enable accelerated creation and deployment of more efficient and flexible chip designs for use in mobile, computing, consumer electronics, and IoT applications.”

The project will help make open source CPU chip and system-on-a-chip (SoC) design more accessible to the market by creating an independent entity where companies and individuals can collaborate and contribute resources. It will provide the chip community with access to high-quality, enterprise-grade hardware, and will be managed collectively by a Board of Directors, a Technical Steering Committee, and community contributors.

To initiate the process, Google will contribute a Universal Verification Methodology (UVM)-based instruction stream generator environment for RISC-V cores. The environment provides configurable, highly stressful instruction sequences that can verify architectural and micro-architectural corner cases of designs.

SiFive will improve the RocketChip SoC generator and the TileLink interconnect fabric in open source as a member of the CHIPS Alliance. It will also contribute to Chisel (a new open-source hardware description language) and the FIRRTL intermediate representation specification, and will maintain Diplomacy, the SoC parameter negotiation framework.

Western Digital, another contributor, will provide its high-performance, 9-stage, dual-issue, 32-bit SweRV core, together with a test bench and a high-performance SweRV instruction set simulator. It will also contribute implementations of the OmniXtend cache coherence protocol.

Looking ahead

Dr. Yunsup Lee, co-founder and CTO of SiFive, said in a statement, “A healthy, vibrant semiconductor industry needs a significant number of design starts, and the CHIPS Alliance will fill this need.” More information is available at CHIPS Alliance org.

Mapzen, an open-source mapping platform, joins the Linux Foundation project
Uber becomes a Gold member of the Linux Foundation
Intel unveils the first 3D Logic Chip packaging technology, ‘Foveros’, powering its new 10nm chips, ‘Sunny Cove’

Google releases two new hardware products, Coral dev board and a USB accelerator built around its Edge TPU chip

Sugandha Lahoti
06 Mar 2019
2 min read
Google teased its new hardware products built around its Edge TPU at the Google Next conference last summer. Yesterday, it officially launched the Coral Dev Board, a Raspberry Pi look-alike designed to run machine learning algorithms ‘at the edge’, and a USB accelerator.

Coral Dev Board

The Coral Dev Board has a 40-pin header and runs Linux on an i.MX8M with an Edge TPU chip for accelerating TensorFlow Lite. The board also features 8GB eMMC storage, 1GB LPDDR4 RAM, and Wi-Fi and Bluetooth 4.1. It has USB 2.0/3.0 ports, a 3.5mm audio jack, a DSI display interface, a MIPI-CSI camera interface, an HDMI 2.0a connector, and two digital PDM microphones.

Source: Google

The Coral Dev Board can be used as a single-board computer when you need accelerated ML processing in a small form factor. It can also be used as an evaluation kit for the SOM and for prototyping IoT devices and other embedded systems. The board is available for $149.00, and Google has also announced a $25 MIPI-CSI 5-megapixel camera for it.

USB Accelerator

The USB Accelerator is a plug-in USB 3.0 stick that adds machine learning capabilities to existing Linux machines. The 65 x 30 mm accelerator connects to Linux-based systems via a USB Type-C port and can also work with a Raspberry Pi board at USB 2.0 speeds. The accelerator is built around a 32-bit, 32MHz Cortex-M0+ chip with 16KB of flash and 2KB of RAM.

Source: Google

The USB Accelerator is available for $75. Developers can build machine learning models for both devices in TensorFlow Lite. More information is available on Google's Coral Beta website.

Coming soon is a PCI-E Accelerator for integrating the Edge TPU into legacy systems using a PCI-E interface, as well as a fully integrated System-on-Module with CPU, GPU, Edge TPU, Wi-Fi, Bluetooth, and Secure Element in a 40mm x 40mm pluggable module.

Google expands its machine learning hardware portfolio with Cloud TPU Pods (alpha)
Intel acquires eASIC, a custom chip (FPGA) maker for IoT, cloud and 5G environments
Raspberry Pi launches its last board for the foreseeable future: the Raspberry Pi 3 Model A+, available now at $25


You can now install Windows 10 on a Raspberry Pi 3

Prasad Ramesh
14 Feb 2019
2 min read
The WoA Installer for Raspberry Pi 3 enables installing Windows 10 on the credit-card-sized computer. It is made by the same members who brought Windows 10 on ARM to the Lumia 950 and 950 XL.

Where to start?

To get started, you need a Raspberry Pi 3 Model B or B+, a microSD card of at least class 1, and a Windows 10 ARM64 image, which you can get from GitHub. You also need a recent version of Windows 10 and .NET Framework 4.6.1. The WoA Installer is just a tool that helps you deploy Windows 10 on the Raspberry Pi 3; it needs the Core Package in order to run. You can find both listed on the GitHub page.

Specification comparison

The minimum requirements for Windows 10 are:
Processor: 1 gigahertz (GHz) or faster processor or SoC
RAM: 1 gigabyte (GB) for 32-bit or 2 GB for 64-bit
Hard disk space: 16 GB for a 32-bit OS, 20 GB for a 64-bit OS

The Raspberry Pi 3B+ has specifications just good enough to run Windows 10:
SoC: Broadcom BCM2837B0 quad-core A53 (ARMv8) 64-bit @ 1.4GHz
RAM: 1GB LPDDR2 SDRAM

While this sounds good, a Hacker News user points out: “Caution: To do this you need to run a rat's nest of a batch file that runs a bunch of different code obtained from the web. If you're going to try this, try on devices you don't care about. Or spend innumerable hours auditing code. Pass -- for now.”

You can check out the GitHub page for more instructions.

Raspberry Pi opens its first offline store in England
Introducing Strato Pi: An industrial Raspberry Pi
Raspberry Pi launches its last board for the foreseeable future: the Raspberry Pi 3 Model A+, available now at $25
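Putting the quoted figures side by side (illustrative only; the numbers are the ones stated above, and the comparison is mine, not the WoA project's): taken literally, the 1 GB Pi 3B+ clears the 32-bit RAM minimum but falls short of the stated 2 GB 64-bit minimum, which squares with the "try it on devices you don't care about" caution.

```python
# Side-by-side check of the figures quoted in the article (illustrative).
WIN10_MIN_32BIT = {"cpu_ghz": 1.0, "ram_gb": 1.0}
WIN10_MIN_64BIT = {"cpu_ghz": 1.0, "ram_gb": 2.0}
PI_3B_PLUS      = {"cpu_ghz": 1.4, "ram_gb": 1.0}  # BCM2837B0 @ 1.4 GHz, 1 GB

def meets(device, minimum):
    """True if the device matches or exceeds every listed minimum."""
    return all(device[k] >= v for k, v in minimum.items())

print(meets(PI_3B_PLUS, WIN10_MIN_32BIT))  # True
print(meets(PI_3B_PLUS, WIN10_MIN_64BIT))  # False -- 1 GB vs the 2 GB 64-bit figure
```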

Raspberry Pi opens its first offline store in England

Prasad Ramesh
08 Feb 2019
2 min read
Raspberry Pi has opened a brick-and-mortar retail store in Cambridge, England. The mini computer maker has always sold its products online, shipping to many countries; this offline store is a first for the company.

Located in the Grand Arcade shopping centre, the Raspberry Pi store opened yesterday. It is not just a boring store with Raspberry Pi boards: the collection includes boards, full demo setups with monitors, keyboards, and mice, plus books, mugs, and even soft toys with Raspberry Pi branding.

You can see some pictures of the new store here: https://twitter.com/Raspberry_Pi/status/1093454153534398464

A user shared his observation of the store on Hacker News: “I had a minute to check it out over lunch - most of the floorspace is dedicated to demonstrating what the raspberry pi can do at a high level. They had stations for coding, gaming, sensors, etc. but only ~1/4th of the space was devoted to inventory. They have a decent selection of Pis, sensor kits, and accessories. Not everyone working there was technical. This is definitely aimed at the general public.”

Raspberry Pi has a strong online community, with people coming up with various DIY projects, but that community is limited to people who already have a keen interest. More stores like this will help familiarize the general public with Raspberry Pi: with branded books, demos, and toys, the store is aimed at popularizing the mini computer.

Introducing Strato Pi: An industrial Raspberry Pi
Raspberry Pi launches its last board for the foreseeable future: the Raspberry Pi 3 Model A+, available now at $25
Introducing Raspberry Pi TV HAT, a new addon that lets you stream live TV

HTC, Intel, Lenovo showcase their products at Day 2 of CES 2019

Sugandha Lahoti
08 Jan 2019
4 min read
CES 2019 kicks off in Las Vegas, Nevada today, Tuesday, January 8, and runs for three days. The conference unofficially kicked off on Sunday, January 6, and you may have a look at the announcements made on that day. Yesterday was the main press day, when the majority of the announcements were made, with many companies showcasing their latest projects and announcing new products, software, and services.

HTC

HTC announced a partnership with Mozilla, bringing Firefox’s virtual reality web browser to the Vive headset. Mozilla first announced Firefox Reality as a dedicated VR web browser in April. In September, they announced that the browser was available on Viveport, Oculus, and Daydream; now it is available for the HTC Vive headset. As part of the deal, HTC is also teaming up with Amazon to make use of Amazon Sumerian.

HTC also announced the Vive Pro Eye virtual reality headset with native, built-in eye tracking. It uses “foveated rendering” to render sharp images wherever the human eye is looking in a virtual scene while reducing the image quality of objects in the periphery.

Intel

Intel made a number of announcements at CES 2019. The company showcased new processors and released a press update on Project Athena, through which it is getting PC makers ready for “a new class of advanced laptops”: effectively Ultrabooks part two, with 5G and artificial intelligence support.

New Intel processors:
New 9th Gen Core processors for a limited number of desktops and laptops
A 10nm Ice Lake processor for thin laptops
A 10nm Lakefield processor using 3D stacking technology for very small computers and tablets
A 10nm Cascade Lake Xeon processor for data processing
3D Athlete Tracking tech, which runs on the Cascade Lake chip and shows how fast and far athletes are traveling
Intel’s 10nm Snow Ridge SoC for 5G base stations

Lenovo

Lenovo has made minor updates to its ThinkPad X1 Carbon and X1 Yoga laptops with new designs for 2019. They are mostly getting a material change and will also be thinner and lighter this year. Lenovo also released two large display monitors: the ThinkVision P44W, aimed at business users, and the Legion Y44w Gaming Monitor. Both have a 43.4-inch panel.

Uber

One of Uber’s partners in the air taxi domain, Bell, revealed the design of its vertical takeoff and landing air taxi at CES 2019. The flying taxi, dubbed the Bell Nexus, can accommodate up to five people and is a hybrid-electric powered vehicle.

CES 2019 also saw the release of the game Marvel's Avengers: Rocket's Rescue Run. This is the first demo product from startup Holoride, which has Audi as one of its stakeholders. It is the result of Audi and Disney's new media format, which aims to bring virtual reality to passengers in cars, specifically to Uber.

More announcements:
Harley-Davidson gave a preview of its first all-electric motorcycle; it will launch in August 2019 and will cost $29,799
TCL announced its first soundbars and a 75-inch version of the excellent 6-Series 4K Roku TV
Elgato announced a professional $199 light rig for Twitch streamers and YouTube creators
Hisense announced its new 2019 4K TV lineup and the Sonic One TV
Griffin introduced new wireless chargers for the iPhone and Apple Watch
Amazon is planning to let people deliver packages inside your garage
Kodak released a new instant camera and printer line
GE announced a 27-inch smart display for the kitchen that streams Netflix
Google Assistant will soon be on a billion devices; its next stop is feature phones
Vizio announced its most advanced 4K TV ever and support for Apple’s AirPlay 2
Toyota shared details of its Guardian driver-assist system, which mimics a technique used in fighter jets to serve as a smart intermediary between driver and car

CES 2019: Top announcements made so far
HTC Vive Focus 2.0 update promises long battery life, among other things, for the VR headset
Intel unveils the first 3D Logic Chip packaging technology, ‘Foveros’, powering its new 10nm chips, ‘Sunny Cove’
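The foveated-rendering idea behind the Vive Pro Eye can be sketched in a few lines. This is a toy illustration of the concept, not HTC's implementation: render at full resolution near the tracked gaze point and progressively cheaper toward the periphery. The radius, falloff, and floor values are made-up parameters.

```python
import math

def render_scale(pixel, gaze, fovea_radius=100.0, min_scale=0.25):
    """Fraction of full resolution to render at, by distance from the gaze point."""
    dist = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1])
    if dist <= fovea_radius:
        return 1.0  # full quality inside the fovea
    # Linear falloff beyond the fovea, clamped at a minimum quality floor.
    return max(min_scale, fovea_radius / dist)

print(render_scale((960, 540), (960, 540)))   # 1.0 at the gaze point
print(render_scale((1800, 540), (960, 540)))  # 0.25 in the far periphery
```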
FCC grants Google the approval for deploying Soli, their radar-based hand motion sensor

Amrata Joshi
02 Jan 2019
3 min read
Today, Reuters reported that Google has won approval from U.S. regulators to deploy a radar-based motion-sensing device known as Project Soli. According to the report, the Federal Communications Commission (FCC) specified in an order late on Monday that it will grant Google a waiver to operate the Soli sensors at higher power levels than currently allowed. The FCC said, “the decision will serve the public interest by providing for innovative device control.”

The Soli sensors capture motion in 3D space with a radar beam, enabling touchless control of functions that can help users with mobility and speech impairments. According to Google, the Soli sensor can let users press an invisible button between the thumb and index finger, or turn a virtual dial by rubbing a thumb against the index finger. Google says, “even though these controls are virtual, the interactions feel physical and responsive”, as feedback is generated by the haptic sensation of fingers touching. According to Google, the virtual tools can approximate the precision of natural human hand motion, and the Soli sensor can be embedded in wearables, computers, phones, and vehicles including aircraft.

Last year in March, Google asked the FCC to allow its short-range interactive motion-sensing Soli radar to operate in the 57 to 64 GHz frequency band at power levels consistent with European Telecommunications Standards Institute standards. In July, Facebook Inc raised concerns that Soli sensors operating in that spectrum band at higher power levels might have issues coexisting with other technologies. Following those discussions, Reuters writes, Google and Facebook jointly told the FCC in September that the sensors could operate at higher than currently allowed power levels without interference. Facebook also told the FCC that it expected a “variety of use cases to develop with respect to new radar devices, including Soli.”

Users are excited about this news and are appreciating Google’s efforts toward experimenting with something new. One user commented on Hacker News, “Go big or go home. That's one thing I like about Google/Alphabet. Not being afraid to try completely new things outside the normal comfort zone of their traditional product space, and aiming for the most radical potentially game-changing ones at that.” Another user commented on Reddit, “Will be interesting to see if this makes it into new watches/phones by the end of the year.”

Google Cloud releases a beta version of SparkR job types in Cloud Dataproc
Google shares initiatives towards enforcing its AI principles; employs a formal review structure for new projects
France to levy digital services tax on big tech companies like Google, Apple, Facebook, Amazon in the new year

Intel unveils the first 3D Logic Chip packaging technology, ‘Foveros’, powering its new 10nm chips, ‘Sunny Cove’

Savia Lobo
13 Dec 2018
3 min read
Yesterday, chip-manufacturing giant Intel unveiled Foveros, its new 3D packaging technology, which makes it possible to stack logic chips on top of one another. Intel says the first products to use Foveros will appear in the second half of next year.

Talking about the stacking logic, Raja Koduri, Intel’s chief architect, said, “You can pack more transistors in a given space. And also you can pack different kinds of transistors; if you want to put a 5G radio right on top of a CPU, solving the stacking problem would be great, because you have all of your functionality but also a small form factor.”

With the Foveros technology, Intel will allow for smaller "chiplets": fast logic chips sitting atop a base die that handles I/O and power delivery. This project will also help Intel overcome one of its biggest challenges, i.e., building full chips at the 10nm scale. The first Foveros-backed product will be a 10-nanometer compute element on a base die, typically used in low-power devices.

Source: Intel

Sunny Cove: Intel’s codename for the new 10nm chips

Sunny Cove will be at the heart of Intel’s next-generation Core and Xeon processors, which will be available in the latter half of next year. According to Intel, Sunny Cove will provide improved latency and will allow more operations to be executed in parallel (thus acting more like a GPU). On the graphics front, Intel also has new Gen11 integrated graphics “designed to break the 1 TFLOPS barrier,” which will be part of these Sunny Cove chips. Intel also promises improved speeds in AI-related tasks, cryptography, and machine learning, among other new features, with the new CPUs.

According to a detailed report by Ars Technica, “Sunny Cove makes the first major change to x64 virtual memory support since AMD introduced its x86-64 64-bit extension to x86 in 2003. Bits 0 through 47 are used, with the top 16 bits, 48 through 63, all copies of bit 47. This limits virtual address space to 256TB. These systems can also support a maximum of 256TB of physical memory.” Sunny Cove lifts those limits by moving to 57-bit virtual addresses with five-level paging.

Starting from the second half of next year, everything from mobile devices to data centers may feature Foveros processors over time. “The company wouldn't say where, exactly, the first Foveros-equipped chip will end up, but it sounds like it'll be ideal for incredibly thin and light machines”, Engadget reports.

To know more about this news in detail, visit the Intel Newsroom.

Microsoft Azure reportedly chooses Xilinx chips over Intel Altera for AI co-processors, says Bloomberg report
Apple T2 security chip has Touch ID, Security Enclave, hardware to prevent microphone eavesdropping, amongst many other features!
How the Titan M chip will improve Android security
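The address-space figures in the Ars Technica quote follow directly from the bit widths, which a few lines of arithmetic can confirm (the 57-bit figure assumes the five-level-paging extension Intel documents for Sunny Cove):

```python
# 48-bit virtual addresses cover 2**48 bytes = 256 TB;
# 57-bit addresses (five-level paging) cover 2**57 bytes = 128 PB.
TB = 2**40  # bytes in a terabyte (binary)
PB = 2**50  # bytes in a petabyte (binary)

print(2**48 // TB)  # 256 (TB of virtual address space with 48-bit addresses)
print(2**57 // PB)  # 128 (PB of virtual address space with 57-bit addresses)
```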

LibrePCB 0.1.0 released with major changes in library editor and file format

Amrata Joshi
03 Dec 2018
2 min read
Last week, the team at LibrePCB released LibrePCB 0.1.0, a free EDA (Electronic Design Automation) package used for developing printed circuit boards. Just three weeks ago, LibrePCB 0.1.0 RC2 was released with major changes to the library manager, control panel, library editor, schematic editor, and more.

The key features of LibrePCB include being cross-platform (Unix/Linux, Mac OS X, Windows), all-in-one (project management plus library, schematic, and board editors), and an intuitive, modern, easy-to-use graphical user interface. It also features powerful library design and human-readable file formats.

What’s new in LibrePCB 0.1.0?

Library editor
The new version saves the library URL and improves how the schematic-only property of components is saved.

File format stability
Since this is a stable release, the file format is now stable: projects created with this version will be loadable with future releases of LibrePCB.

Users are comparing LibrePCB 0.1.0 with KiCad, a free open source EDA suite for OS X, Linux, and Windows, and asking which one is better. Many users think LibrePCB 0.1.0 comes out ahead because its part libraries are managed well, whereas KiCad lacks a coherent workflow for managing part libraries: it is difficult to keep a part's schematic symbol, footprint, and 3D model together in KiCad.

Read more about this news, in detail, on the LibrePCB blog.

A libre GPU effort based on RISC-V, Rust, LLVM and Vulkan by the developer of an earth-friendly
How to secure your Raspberry Pi board [Tutorial]
Nvidia unveils a new Turing architecture: “The world’s first ray tracing GPU”
Introducing Strato Pi: An industrial Raspberry Pi

Prasad Ramesh
26 Nov 2018
4 min read
Italian company Sfera Labs has designed Strato Pi, a Raspberry Pi based board intended for industrial applications. It can be used in areas where a higher level of reliability is required.

Source: sferlabs website

Strato Pi features

The board is roughly the same size as a regular Raspberry Pi 2/3 and is engineered to work in industrial environments that demand more rugged devices.

Power supply that can handle harsh environments
The Strato Pi accepts a wide range of supply voltages and can handle substantial amounts of ripple, noise, and voltage fluctuation. The power supply circuit is heavily protected and filtered with oversized electrolytic capacitors, diodes, inductors, and a high-efficiency voltage regulator. The power converter is based on PWM converter integrated circuits, which can provide up to 95% power efficiency and up to 3A continuous current output. Overcurrent limiting, overvoltage protection, and thermal shutdown are also built in. The board is protected against reverse polarity with resettable fuses, and surge protection up to ±500V/2ohms 1.2/50μs ensures reliability even in harsh environments.

UPS to safeguard against power failure
In database and data-collection applications, a sudden power interruption may cause data loss. To tackle this, Strato Pi has an integrated UPS that gives the system enough time to save data and shut down cleanly when there is a power failure. The battery power supply stage supplies power to the Strato Pi circuits without any interruption even when the main power supply fails. This stage also charges the battery via a high-efficiency step-up converter, generating the optimal charging voltage independent of the main power supply voltage.

Built-in real time clock
The Strato Pi has a built-in battery-backed real time clock/calendar, directly connected to the Raspberry Pi via the I2C bus interface, so the board keeps the correct time even when there is no internet connection. The real time clock is based on the MCP79410, a general-purpose Microchip RTCC chip. A replaceable CR1025 battery acts as the backup power source when main power is not available; in an always-powered-on state, the battery can last over 10 years.

Serial port
Strato Pi's RS-232 and RS-485 serial port interface circuits are insulated from the main and battery power supply voltages, which avoids failures due to ground loops. A microcontroller running a proprietary algorithm automatically manages the data direction of the RS-485 line, adapting to the baud rate and number of bits without any special configuration. Thus, the Raspberry Pi can communicate through its TX/RX lines without any additional control signal.

CAN bus
The Controller Area Network (CAN) bus is widely used and based on a multi-master architecture. The board implements an easy-to-use CAN bus controller supporting CAN specification version 2.0B at up to 1 Mbps. The RS-485 and CAN bus ports can be used at the same time.

A hardware watchdog
A hardware watchdog is an electronic circuit that can automatically reset the processor if the software hangs. It is implemented with the help of the on-board microcontroller and is independent of the Raspberry Pi's internal CPU watchdog circuit.

The base variant starts at roughly $88; Sfera Labs also offers a mini variant and products like a prebuilt server. For more details on Strato Pi, visit the sferlabs website.

Raspberry Pi launches its last board for the foreseeable future: the Raspberry Pi 3 Model A+, available now at $25
Introducing Raspberry Pi TV HAT, a new addon that lets you stream live TV
Intelligent mobile projects with TensorFlow: Build your first Reinforcement Learning model on Raspberry Pi [Tutorial]
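Like most RTCC chips, the MCP79410 stores its time registers in packed BCD, with control flags mixed into the same bytes (for example, bit 7 of the seconds register is the oscillator-start bit). A minimal decode looks like the sketch below; it is illustrative only, so check the datasheet for the exact register map before relying on it, and the sample register values are invented.

```python
def bcd_to_int(b):
    """Convert one packed-BCD byte (two decimal digits) to an integer."""
    return (b >> 4) * 10 + (b & 0x0F)

def decode_time(sec_reg, min_reg, hour_reg):
    """Decode seconds/minutes/hours registers read over I2C, masking control bits."""
    seconds = bcd_to_int(sec_reg & 0x7F)   # drop the oscillator-start (ST) bit
    minutes = bcd_to_int(min_reg & 0x7F)
    hours = bcd_to_int(hour_reg & 0x3F)    # assumes 24-hour mode
    return hours, minutes, seconds

# Example: seconds register 0x95 has the ST bit set on top of BCD 15.
print(decode_time(0x95, 0x42, 0x13))  # (13, 42, 15), i.e. 13:42:15
```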

Raspberry Pi launches its last board for the foreseeable future: the Raspberry Pi 3 Model A+, available now at $25

Prasad Ramesh
16 Nov 2018
2 min read
Yesterday, Raspberry Pi launched the Raspberry Pi 3 Model A+ board, a smaller and cheaper version of the Raspberry Pi 3B+. In 2014, the first-generation Raspberry Pi 1 Model B+ was followed by a lighter Model A+ with half the RAM and fewer ports, which could fit the footprint of their Hardware Attached on Top (HAT) add-on boards. Until now there were no such small form factor boards for the Raspberry Pi 2 and 3.

Size is cut down, but not (most of) the features

The Raspberry Pi 3 Model A+ retains most of the features and enhancements of the bigger board in this series. This includes a 1.4GHz 64-bit quad-core ARM Cortex-A53 CPU, 512MB LPDDR2 SDRAM, and dual-band 802.11ac wireless LAN and Bluetooth 4.2/BLE. The enhancements retained are improved USB mass-storage booting and improved thermal management.

The entire Raspberry Pi 3 Model A+ board is an FCC-certified radio module, which will significantly reduce the cost of conformance testing for Raspberry Pi based products. What has shrunk is the price, now down to $25, and the board size, at 65x56mm: the size of a HAT.

Source: Raspberry Pi website

Raspberry Pi 3 Model A+ will likely be the last product for now

In March this year, Raspberry Pi said that the 3+ platform is the final iteration of the “classic” Raspberry Pi boards. The next products will come out of necessity, not evolution, because a real evolution will need new core silicon, on a new process node, with new memory technology. So this new board, the 3A+, is about closing things out, meaning we won't see any more products in this line in the foreseeable future. The board does answer one of their most frequent customer requests for ‘missing products’, and it clears the pipeline to focus on building the next generation of Raspberry Pi boards.

For more details visit the Raspberry Pi website.

Introducing Raspberry Pi TV HAT, a new addon that lets you stream live TV
Tensorflow 1.9 now officially supports Raspberry Pi bringing machine learning to DIY enthusiasts
Should you go with Arduino Uno or Raspberry Pi 3 for your next IoT project?