Creating a scalable crawler with ConcurrentBag
This recipe shows you how to scale a workload across a number of independent workers that both produce work and process it.
Getting ready
To work through this recipe, you will need Visual Studio 2015. There are no other prerequisites. The source code for this recipe can be found at BookSamples\Chapter6\Recipe4.
How to do it...
The following steps demonstrate how to scale a workload across a number of independent workers that both produce work and process it:
Start Visual Studio 2015. Create a new C# console application project.
In the Program.cs file, add the following using directives:

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;
using static System.Console;
Add the following code snippet below the Main method:

static Dictionary<string, string[]> _contentEmulation = new Dictionary<string, string[]>();

static async Task RunProgram()
{
    var bag = new ConcurrentBag<CrawlingTask...
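The listing above is cut off here; the complete program is in the book's sample code at BookSamples\Chapter6\Recipe4. To see the pattern in isolation, the following is a minimal, self-contained sketch of the same idea: several workers share one ConcurrentBag<CrawlingTask>, and each worker both consumes tasks from the bag and puts newly discovered work back into it. The CrawlingTask shape, the emulated site map, the depth limit, and the worker count are illustrative assumptions, not the recipe's exact code.

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using static System.Console;

// Hypothetical work item: a URL to visit and how deep into the site we already are.
class CrawlingTask
{
    public string UrlToCrawl { get; set; }
    public int Depth { get; set; }
}

class Program
{
    // Emulated "web": each page maps to the links it contains.
    static readonly Dictionary<string, string[]> _contentEmulation =
        new Dictionary<string, string[]>
        {
            ["http://site1/"] = new[] { "http://site1/a", "http://site1/b" },
            ["http://site2/"] = new[] { "http://site2/a" },
            ["http://site1/a"] = new[] { "http://site1/a/1" },
            ["http://site1/b"] = new string[0],
            ["http://site2/a"] = new string[0],
            ["http://site1/a/1"] = new string[0]
        };

    static async Task Crawl(ConcurrentBag<CrawlingTask> bag, int workerId)
    {
        CrawlingTask task;
        // Keep taking work until the bag is empty. A worker that finds the bag
        // empty simply exits; no work is lost, because whichever worker adds
        // new tasks will pick them up on its next iteration.
        while (bag.TryTake(out task))
        {
            await Task.Delay(50); // emulate downloading the page

            WriteLine("Worker {0} crawled {1} (depth {2})",
                workerId, task.UrlToCrawl, task.Depth);

            string[] links;
            if (task.Depth < 2 &&
                _contentEmulation.TryGetValue(task.UrlToCrawl, out links))
            {
                // The consumer is also a producer: discovered links go back
                // into the shared bag for any worker to pick up.
                foreach (var link in links)
                {
                    bag.Add(new CrawlingTask { UrlToCrawl = link, Depth = task.Depth + 1 });
                }
            }
        }
    }

    static async Task RunProgram()
    {
        var bag = new ConcurrentBag<CrawlingTask>();

        // Seed the bag with the root pages so the workers have something to start with.
        bag.Add(new CrawlingTask { UrlToCrawl = "http://site1/", Depth = 0 });
        bag.Add(new CrawlingTask { UrlToCrawl = "http://site2/", Depth = 0 });

        // Four independent workers share the same bag.
        var workers = Enumerable.Range(1, 4)
            .Select(id => Task.Run(() => Crawl(bag, id)))
            .ToArray();

        await Task.WhenAll(workers);
    }

    static void Main()
    {
        RunProgram().GetAwaiter().GetResult();
    }
}

ConcurrentBag<T> suits this mixed producer/consumer workload because it is optimized for the case where the same thread both adds and removes items: each thread keeps its own local list and only steals from other threads' lists when its own is empty.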