Why is the BackgroundWorker + BlockingCollection combination slower?


I have a program that accesses a database and downloads images, and I'm using a BlockingCollection for that purpose. However, in order to access some UI elements, I decided to combine a BackgroundWorker with the BlockingCollection. Compared with the speed when using only the BlockingCollection, this slows processing down significantly. What is the reason? Or is the slowdown because I am now accessing UI elements?

Here is the code I'm working with:

    private void button_Start_Click(object sender, System.EventArgs e)
    {
        BackgroundWorker bgWorker = new BackgroundWorker();
        bgWorker.DoWork += bw_DoWork;
        bgWorker.RunWorkerCompleted += bw_RunWorkerCompleted;
        bgWorker.ProgressChanged += bw_ProgressChanged;
        bgWorker.WorkerSupportsCancellation = true;
        bgWorker.WorkerReportsProgress = true;
        Button btnSender = (Button)sender;
        btnSender.Enabled = false;
        bgWorker.RunWorkerAsync();
    }

The bw_DoWork handler is as follows:

    private void bw_DoWork(object sender, DoWorkEventArgs e)
    {
        HttpWebRequest request = null;
        using (BlockingCollection<ImageFileName> bc = new BlockingCollection<ImageFileName>(30))
        {
            using (Task task1 = Task.Factory.StartNew(() =>
            {
                foreach (var fileName in fileNames)
                {
                    string baseUrl = "http://some url";
                    string url = string.Format(baseUrl, fileName);
                    request = (HttpWebRequest)WebRequest.Create(url);
                    request.Method = "GET";
                    request.ContentType = "application/x-www-form-urlencoded";
                    var response = (HttpWebResponse)request.GetResponse();
                    Stream stream = response.GetResponseStream();
                    img = Image.FromStream(stream);
                    FileNameImage = new ImageFileName(fileName.ToString(), img);
                    bc.Add(FileNameImage);
                    Thread.Sleep(100);
                    Console.WriteLine("Size of BlockingCollection: {0}", bc.Count);
                }

            }))
            {
                using (Task task2 = Task.Factory.StartNew(() =>
                {

                    foreach (ImageFileName imgfilename2 in bc.GetConsumingEnumerable())
                    {
                        if (bw.CancellationPending == true)
                        {
                            e.Cancel = true;
                            break;
                        }
                        else
                        {
                            int numIterations = 4;
                            Image img2 = imgfilename2.Image;
                            for (int i = 0; i < numIterations; i++)
                            {
                                img2.Save("C:''path" + imgfilename2.ImageName);
                                ZoomThumbnail = img2;
                                ZoomSmall = img2;
                                ZoomLarge = img2;
                                ZoomThumbnail = GenerateThumbnail(ZoomThumbnail, 86, false);
                                ZoomThumbnail.Save("C:\\path" + imgfilename2.ImageName + "_Thumb.jpg");
                                ZoomThumbnail.Dispose();
                                ZoomSmall = GenerateThumbnail(ZoomSmall, 400, false);
                                ZoomSmall.Save("C:\\path" + imgfilename2.ImageName + "_Small.jpg");
                                ZoomSmall.Dispose();
                                ZoomLarge = GenerateThumbnail(ZoomLarge, 1200, false);
                                ZoomLarge.Save("C:\\path" + imgfilename2.ImageName + "_Large.jpg");
                                ZoomLarge.Dispose();
                                //  progressBar1.BeginInvoke(ProgressBarChange);
                                int percentComplete = (int)(((i + 1.0) / (double)numIterations) * 100.0);
                                //if (progressBar1.InvokeRequired)
                                //{
                                //    BeginInvoke(new MethodInvoker(delegate{bw.ReportProgress(percentComplete)};))
                                //}
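                                // Note (assumption): when WorkerReportsProgress is true and RunWorkerAsync
                                // was called from the UI thread, bw.ReportProgress(percentComplete) raises
                                // ProgressChanged on the UI thread, so the BeginInvoke wrapper sketched
                                // above should not be necessary.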
                            }
                            Console.WriteLine("This is Take part and size is: {0}", bc.Count);
                        }
                    }

                }))
                    Task.WaitAll(task1, task2);

            }
        }
    }
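
For reference, the bw_ProgressChanged and bw_RunWorkerCompleted handlers that button_Start_Click wires up are not shown in the question. A minimal sketch of what they might look like, assuming a ProgressBar named progressBar1 (the name used by the commented-out code above) and a start button field named button_Start (a hypothetical name), could be:

    private void bw_ProgressChanged(object sender, ProgressChangedEventArgs e)
    {
        // ProgressChanged is raised on the UI thread, so the control can be updated directly.
        progressBar1.Value = e.ProgressPercentage;
    }

    private void bw_RunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
    {
        // Re-enable the start button once the work finishes or is cancelled.
        button_Start.Enabled = true;
    }

Because BackgroundWorker marshals both events back to the UI thread, no BeginInvoke calls are needed inside these handlers.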


A better option would probably be to make retrieving the data and writing it to disk run synchronously for each file, and instead use Parallel.ForEach() to allow multiple requests to be in flight at the same time. That should reduce the amount of waiting in a few places:

  • No waiting for one HTTP request to complete before the next one is issued
  • No blocking on the BlockingCollection
  • No waiting for one disk write to finish before the next one starts

Maybe something more like this:

Parallel.ForEach(fileNames, 
    (name) => 
    {
        string baseUrl = "http://some url";
        string url = string.Format(baseUrl, name);
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "GET";
        request.ContentType = "application/x-www-form-urlencoded";
        var response = (HttpWebResponse)request.GetResponse();
        Stream stream = response.GetResponseStream();
        var img = Image.FromStream(stream);
        // Cutting out a lot of steps from the 2nd Task to simplify the example
        img.Save(Path.Combine("C:\\path", name.ToString()));  
    });

One problem you might run into with this approach is that it will start too many requests at once. That can cause resource-contention issues, or the web server may interpret it as malicious behavior and stop responding to you. You can limit the number of simultaneous requests by setting MaxDegreeOfParallelism. The following example shows how to limit the operation to processing no more than 4 files at a time.

var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };
Parallel.ForEach(fileNames, options, (name) => { /* do stuff */ });
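
Putting the two pieces together (note that ParallelOptions is passed as the second argument of this Parallel.ForEach overload), a fuller sketch of this approach might look like the following. It still uses the placeholder URL and C:\path folder from the snippets above and assumes the usual System.Net, System.IO, and System.Drawing usings; the using blocks simply make sure the response, stream, and image get disposed:

var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };
Parallel.ForEach(fileNames, options, (name) =>
{
    string baseUrl = "http://some url";          // placeholder URL from the question
    string url = string.Format(baseUrl, name);
    var request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = "GET";
    request.ContentType = "application/x-www-form-urlencoded";

    using (var response = (HttpWebResponse)request.GetResponse())
    using (var stream = response.GetResponseStream())
    using (var img = Image.FromStream(stream))
    {
        // Download and save run synchronously per file; the parallelism comes from ForEach itself.
        img.Save(Path.Combine("C:\\path", name.ToString()));
    }
});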