{"text":"Node.js: Produce docs files","meta":{"source":"GitHub","url":"https://github.com/googleapis/artman/issues/245"},"label":"DOCUMENTATION","_input_hash":1711076198,"_task_hash":-1378074282,"answer":"reject"} {"text":"# Documentation for Virtual Appliance 3.0\n\nUpdate VA technical documentation, and complete release notes, for Virtual Appliance 3.0.","title":"Documentation for Virtual Appliance 3.0","body":"Update VA technical documentation, and complete release notes, for Virtual Appliance 3.0.","html":"

Documentation for Virtual Appliance 3.0

\n\n

Update VA technical documentation, and complete release notes, for Virtual Appliance 3.0.

\n","meta":{"source":"GitHub","url":"https://github.com/ncbo/virtual_appliance/issues/12"},"_input_hash":-1315993967,"_task_hash":-695731818,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"# Add Documentation\n\n","title":"Add Documentation","body":"","html":"

Add Documentation

\n","meta":{"source":"GitHub","url":"https://github.com/unicornsden/pixie/issues/9"},"_input_hash":177319935,"_task_hash":-327871891,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"remark-ping: docs mention zestedesavoir, not how to set it up?","meta":{"source":"GitHub","url":"https://github.com/zestedesavoir/zmarkdown/issues/131"},"label":"DOCUMENTATION","_input_hash":-256922106,"_task_hash":-1630045144,"answer":"accept"} {"text":"Update lunr dependency","meta":{"source":"GitHub","url":"https://github.com/Rebilly/ReDoc/issues/310"},"label":"DOCUMENTATION","_input_hash":-1444272421,"_task_hash":38940004,"answer":"reject"} {"text":"# RFC: Remove pages pertaining to old versions of tesseract from wiki\n\nI suggest removing pages pertaining to old versions of tesseract from wiki.\r\n\r\neg. There are multiple pages with instructions for creating box files for tesseract 3.x. These instructions do not apply to tesseract4 and above. Users who follow these and then try to run lstmtraining get unexpected errors.\r\n\r\nHere is the current list of wiki pages:\r\n\r\n```\r\nHome\r\n4.0 Accuracy and Performance\r\n4.0 Docker Containers\r\n4.0 with LSTM\r\n4.0x Changelog\r\n4.0x Common Errors and Resolutions\r\nAddOns\r\nAPIExample\r\nAPIExample user_patterns\r\nCommand Line Usage\r\nCompiling\r\nCompiling \u2013 GitInstallation\r\nControlParams\r\nData Files\r\nData Files Contributions\r\nData Files in different versions\r\nData Files in tessdata_fast\r\nDocumentation\r\nDownloads\r\nFAQ\r\nFAQ Old\r\nFix footer\r\nFonts\r\nImproveQuality\r\nMake Box Files\r\nMaking Box Files 4.0\r\nNeuralNetsInTesseract4.00\r\nPlanning\r\nReadMe\r\nReleaseNotes\r\nTechnical Documentation\r\nTesseractOpenCL\r\nTestingTesseract\r\nThe Hallucination Effect\r\nTraining Tesseract\r\nTraining Tesseract 3.00\u20133.02\r\nTraining Tesseract 3.03\u20133.05\r\nTraining Tesseract \u2013 Make Box Files\r\nTraining Tesseract \u2013 
tesstrain.sh\r\nTrainingTesseract\r\nTrainingTesseract 4.00\r\nTrainingTesseract 4.00 Finetune\r\nTrainingTesseract2\r\nUNLV Testing of Tesseract\r\nUser App Example\r\nUser Projects \u2013 3rdParty\r\nVGSLSpecs\r\nViewerDebugging\r\n```","title":"RFC: Remove pages pertaining to old versions of tesseract from wiki","body":"I suggest removing pages pertaining to old versions of tesseract from wiki.\r\n\r\neg. There are multiple pages with instructions for creating box files for tesseract 3.x. These instructions do not apply to tesseract4 and above. Users who follow these and then try to run lstmtraining get unexpected errors.\r\n\r\nHere is the current list of wiki pages:\r\n\r\n```\r\nHome\r\n4.0 Accuracy and Performance\r\n4.0 Docker Containers\r\n4.0 with LSTM\r\n4.0x Changelog\r\n4.0x Common Errors and Resolutions\r\nAddOns\r\nAPIExample\r\nAPIExample user_patterns\r\nCommand Line Usage\r\nCompiling\r\nCompiling \u2013 GitInstallation\r\nControlParams\r\nData Files\r\nData Files Contributions\r\nData Files in different versions\r\nData Files in tessdata_fast\r\nDocumentation\r\nDownloads\r\nFAQ\r\nFAQ Old\r\nFix footer\r\nFonts\r\nImproveQuality\r\nMake Box Files\r\nMaking Box Files 4.0\r\nNeuralNetsInTesseract4.00\r\nPlanning\r\nReadMe\r\nReleaseNotes\r\nTechnical Documentation\r\nTesseractOpenCL\r\nTestingTesseract\r\nThe Hallucination Effect\r\nTraining Tesseract\r\nTraining Tesseract 3.00\u20133.02\r\nTraining Tesseract 3.03\u20133.05\r\nTraining Tesseract \u2013 Make Box Files\r\nTraining Tesseract \u2013 tesstrain.sh\r\nTrainingTesseract\r\nTrainingTesseract 4.00\r\nTrainingTesseract 4.00 Finetune\r\nTrainingTesseract2\r\nUNLV Testing of Tesseract\r\nUser App Example\r\nUser Projects \u2013 3rdParty\r\nVGSLSpecs\r\nViewerDebugging\r\n```","html":"

RFC: Remove pages pertaining to old versions of tesseract from wiki

\n\n

I suggest removing pages pertaining to old versions of tesseract from wiki.

\n\n

eg. There are multiple pages with instructions for creating box files for tesseract 3.x. These instructions do not apply to tesseract4 and above. Users who follow these and then try to run lstmtraining get unexpected errors.

\n\n

Here is the current list of wiki pages:

\n\n

\nHome\n4.0 Accuracy and Performance\n4.0 Docker Containers\n4.0 with LSTM\n4.0x Changelog\n4.0x Common Errors and Resolutions\nAddOns\nAPIExample\nAPIExample user_patterns\nCommand Line Usage\nCompiling\nCompiling \u2013 GitInstallation\nControlParams\nData Files\nData Files Contributions\nData Files in different versions\nData Files in tessdata_fast\nDocumentation\nDownloads\nFAQ\nFAQ Old\nFix footer\nFonts\nImproveQuality\nMake Box Files\nMaking Box Files 4.0\nNeuralNetsInTesseract4.00\nPlanning\nReadMe\nReleaseNotes\nTechnical Documentation\nTesseractOpenCL\nTestingTesseract\nThe Hallucination Effect\nTraining Tesseract\nTraining Tesseract 3.00\u20133.02\nTraining Tesseract 3.03\u20133.05\nTraining Tesseract \u2013 Make Box Files\nTraining Tesseract \u2013 tesstrain.sh\nTrainingTesseract\nTrainingTesseract 4.00\nTrainingTesseract 4.00 Finetune\nTrainingTesseract2\nUNLV Testing of Tesseract\nUser App Example\nUser Projects \u2013 3rdParty\nVGSLSpecs\nViewerDebugging\n

\n","meta":{"source":"GitHub","url":"https://github.com/tesseract-ocr/tesseract/issues/2610"},"_input_hash":173138659,"_task_hash":-127009771,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"Ubuntu 16.04 LST installation instructions","meta":{"source":"GitHub","url":"https://github.com/janverschelde/PHCpack/issues/13"},"label":"DOCUMENTATION","_input_hash":528871228,"_task_hash":-1218312113,"answer":"accept"} {"text":"Usage instructions incomplete","meta":{"source":"GitHub","url":"https://github.com/joshua-barber/Docker-powered-development/issues/4"},"label":"DOCUMENTATION","_input_hash":-119786562,"_task_hash":-2145405210,"answer":"accept"} {"text":"GDB debugging fails for TizenRT/Artik053","meta":{"source":"GitHub","url":"https://github.com/Samsung/TizenRT/issues/316"},"label":"DOCUMENTATION","_input_hash":604689336,"_task_hash":-1680783842,"answer":"reject"} {"text":"How to use remotely?","meta":{"source":"GitHub","url":"https://github.com/ShopRunner/jupyter-notify/issues/9"},"label":"DOCUMENTATION","_input_hash":2096963451,"_task_hash":-1991864210,"answer":"reject"} {"text":"# Add support for MSI in Azure App Service\n\nPlease add support for the different MSI endpoint and flow used in the authentication of Azure App Service. This flow uses the `MSI_ENDPONT` and `MSI_SECRET` environment variables.\r\n\r\nhttps://docs.microsoft.com/en-us/azure/app-service/overview-managed-identity#using-the-rest-protocol","title":"Add support for MSI in Azure App Service","body":"Please add support for the different MSI endpoint and flow used in the authentication of Azure App Service. This flow uses the `MSI_ENDPONT` and `MSI_SECRET` environment variables.\r\n\r\nhttps://docs.microsoft.com/en-us/azure/app-service/overview-managed-identity#using-the-rest-protocol","html":"

Add support for MSI in Azure App Service

\n\n

Please add support for the different MSI endpoint and flow used in the authentication of Azure App Service. This flow uses the MSI_ENDPONT and MSI_SECRET environment variables.

\n\n

https://docs.microsoft.com/en-us/azure/app-service/overview-managed-identity#using-the-rest-protocol

\n","meta":{"source":"GitHub","url":"https://github.com/Azure/go-autorest/issues/447"},"_input_hash":-1082266992,"_task_hash":-1209683056,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"RRD dependency","meta":{"source":"GitHub","url":"https://github.com/kytos/documentation/issues/58"},"label":"DOCUMENTATION","_input_hash":453617977,"_task_hash":1732919683,"answer":"reject"} {"text":"# Blender Addon for importing OpenMVG SfM reconstruction results\n\nHi Pierre,\r\n\r\nI am working on a Blender Addon (https://github.com/SBCV/Blender-Addon-Photogrammetry-Importer) that allows you to import different photogrammetry data formats into Blender. The latest version supports now OpenMVG JSON files.\r\n\r\nI can imagine that this tool could be useful for other OpenMVG JSON files users as well. It offers for example a nice way to visualize the reconstruction results (including cameras and image planes).\r\n\r\nOne can use Blender's camera animation tool to render the reconstruction. The camera animation in Blender offers many useful options to define the camera motion. You can define for example the camera path in 3D and add looking constraints for the camera.\r\nFurthermore, you can load different models (with different file formats) into Blender at the same time. So you can for example render a point cloud and the corresponding mesh at the same time (e.g. to highlight differences).\r\nI used it for example to compare reconstruction results of virtual data with the corresponding virtual environment.\r\n\r\nFor Blender users:\r\nYou can use OpenMVG instead of Blender's camera tracking tool to reconstruct the scene. Which is way more comfortable, since Blender's camera tracking requires a lot of user interaction to compute reasonable results.\r\n\r\nJust wanted to inform you about that. 
Maybe you want to add a reference to the documentation.\r\n\r\nFeel free to close this issue.\r\n\r\nCheers\r\nSebastian","title":"Blender Addon for importing OpenMVG SfM reconstruction results","body":"Hi Pierre,\r\n\r\nI am working on a Blender Addon (https://github.com/SBCV/Blender-Addon-Photogrammetry-Importer) that allows you to import different photogrammetry data formats into Blender. The latest version supports now OpenMVG JSON files.\r\n\r\nI can imagine that this tool could be useful for other OpenMVG JSON files users as well. It offers for example a nice way to visualize the reconstruction results (including cameras and image planes).\r\n\r\nOne can use Blender's camera animation tool to render the reconstruction. The camera animation in Blender offers many useful options to define the camera motion. You can define for example the camera path in 3D and add looking constraints for the camera.\r\nFurthermore, you can load different models (with different file formats) into Blender at the same time. So you can for example render a point cloud and the corresponding mesh at the same time (e.g. to highlight differences).\r\nI used it for example to compare reconstruction results of virtual data with the corresponding virtual environment.\r\n\r\nFor Blender users:\r\nYou can use OpenMVG instead of Blender's camera tracking tool to reconstruct the scene. Which is way more comfortable, since Blender's camera tracking requires a lot of user interaction to compute reasonable results.\r\n\r\nJust wanted to inform you about that. Maybe you want to add a reference to the documentation.\r\n\r\nFeel free to close this issue.\r\n\r\nCheers\r\nSebastian","html":"

Blender Addon for importing OpenMVG SfM reconstruction results

\n\n

Hi Pierre,

\n\n

I am working on a Blender Addon (https://github.com/SBCV/Blender-Addon-Photogrammetry-Importer) that allows you to import different photogrammetry data formats into Blender. The latest version supports now OpenMVG JSON files.

\n\n

I can imagine that this tool could be useful for other OpenMVG JSON files users as well. It offers for example a nice way to visualize the reconstruction results (including cameras and image planes).

\n\n

One can use Blender's camera animation tool to render the reconstruction. The camera animation in Blender offers many useful options to define the camera motion. You can define for example the camera path in 3D and add looking constraints for the camera.\nFurthermore, you can load different models (with different file formats) into Blender at the same time. So you can for example render a point cloud and the corresponding mesh at the same time (e.g. to highlight differences).\nI used it for example to compare reconstruction results of virtual data with the corresponding virtual environment.

\n\n

For Blender users:\nYou can use OpenMVG instead of Blender's camera tracking tool to reconstruct the scene. Which is way more comfortable, since Blender's camera tracking requires a lot of user interaction to compute reasonable results.

\n\n

Just wanted to inform you about that. Maybe you want to add a reference to the documentation.

\n\n

Feel free to close this issue.

\n\n

Cheers\nSebastian

\n","meta":{"source":"GitHub","url":"https://github.com/openMVG/openMVG/issues/1585"},"_input_hash":-1551194699,"_task_hash":1468141772,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# PDFSurface fails with file object.\n\nConsider the following code:\r\n\r\n import cairo\r\n fp = open('out.pdf', 'w')\r\n s = cairo.PDFSurface(fp, 400, 400)\r\n s.finish()\r\n\r\nIt fails with this error message:\r\n\r\n Traceback (most recent call last):\r\n File \"issue.py\", line 4, in \r\n s.finish()\r\n __main__.IOError: error while writing to output stream\r\n\r\nI would expect this code to produce the PDF file `out.pdf`. Replacing the file object by the file path works. Could you figure out, what causes this problem?","title":"PDFSurface fails with file object.","body":"Consider the following code:\r\n\r\n import cairo\r\n fp = open('out.pdf', 'w')\r\n s = cairo.PDFSurface(fp, 400, 400)\r\n s.finish()\r\n\r\nIt fails with this error message:\r\n\r\n Traceback (most recent call last):\r\n File \"issue.py\", line 4, in \r\n s.finish()\r\n __main__.IOError: error while writing to output stream\r\n\r\nI would expect this code to produce the PDF file `out.pdf`. Replacing the file object by the file path works. Could you figure out, what causes this problem?","html":"

PDFSurface fails with file object.

\n\n

Consider the following code:

\n\n
import cairo\nfp = open('out.pdf', 'w')\ns = cairo.PDFSurface(fp, 400, 400)\ns.finish()\n
\n\n

It fails with this error message:

\n\n
Traceback (most recent call last):\n  File \"issue.py\", line 4, in <module>\n    s.finish()\n__main__.IOError: error while writing to output stream\n
\n\n

I would expect this code to produce the PDF file out.pdf. Replacing the file object by the file path works. Could you figure out, what causes this problem?

\n","meta":{"source":"GitHub","url":"https://github.com/pygobject/pycairo/issues/153"},"_input_hash":1205133696,"_task_hash":554879863,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"\"bad marshal data\" when loading model that was saved with python 2.7 into python 3.4.","meta":{"source":"GitHub","url":"https://github.com/fchollet/keras/issues/7440"},"label":"DOCUMENTATION","_input_hash":-1350691810,"_task_hash":-1551093966,"answer":"reject"} {"text":"docker-machine can not create vm","meta":{"source":"GitHub","url":"https://github.com/docker/machine/issues/4206"},"label":"DOCUMENTATION","_input_hash":1706816764,"_task_hash":1865156528,"answer":"reject"} {"text":"Add FAQ's to the documentation","meta":{"source":"GitHub","url":"https://github.com/peterramsing/lost/issues/392"},"label":"DOCUMENTATION","_input_hash":404677717,"_task_hash":673450902,"answer":"accept"} {"text":"Documentation and landing page","meta":{"source":"GitHub","url":"https://github.com/antick/skyii/issues/7"},"label":"DOCUMENTATION","_input_hash":-599082303,"_task_hash":-1451425567,"answer":"accept"} {"text":"# Swap Buttons in GUI\n\n**Describe the feature**\r\nA button in the Companion Admin GUI that would allow two already-programmed functions/buttons to swap places with each other, rather than moving one button and reprogramming the other.\r\n\r\n**Is this platform dependent (windows, mac, ..)?**\r\nNo\r\n\r\n**If documentation is required to implement, do you know where to find it?**\r\nNo, I don't. I do know that there is a similar function in the guts of MA software. When you \"move\" an executor to a fader or button where there is already an executor programmed, the one in the destination moves to where the first executor was originally.\r\n\r\n**Usecases**\r\nIf I have a full Companion page programmed, but decide to change the layout, there are a few options open to me. 
For the examples, I will say that I want to swap buttons _1.1_ and _1.12_\r\n\r\n**1**) I can move/copy _1.1_ to _2.1_, then move _1.12_ to _1.1_, then move _2.1_ to _1.12_\r\n\r\n**2**) I can export page 1, then import it to page 2. I can then copy _1.1_ to _2.12_, and copy _1.12_ to _2.1_. If I wanted those functions to remain on page 1, I would then export page 2, and import it to page 1 ... and then delete page 2\r\n\r\nBoth of these options are viable, but don't seem very efficient. The efficiency lessens if the next empty page is further away from the page you want to edit. \r\n\r\nHaving a swap function so that I can click on \"Swap\" -> button _1.1_ -> button _1.12_ and have them change places would be time-saving.\r\n\r\nThank you for your consideration of this feature.","title":"Swap Buttons in GUI","body":"**Describe the feature**\r\nA button in the Companion Admin GUI that would allow two already-programmed functions/buttons to swap places with each other, rather than moving one button and reprogramming the other.\r\n\r\n**Is this platform dependent (windows, mac, ..)?**\r\nNo\r\n\r\n**If documentation is required to implement, do you know where to find it?**\r\nNo, I don't. I do know that there is a similar function in the guts of MA software. When you \"move\" an executor to a fader or button where there is already an executor programmed, the one in the destination moves to where the first executor was originally.\r\n\r\n**Usecases**\r\nIf I have a full Companion page programmed, but decide to change the layout, there are a few options open to me. For the examples, I will say that I want to swap buttons _1.1_ and _1.12_\r\n\r\n**1**) I can move/copy _1.1_ to _2.1_, then move _1.12_ to _1.1_, then move _2.1_ to _1.12_\r\n\r\n**2**) I can export page 1, then import it to page 2. I can then copy _1.1_ to _2.12_, and copy _1.12_ to _2.1_. If I wanted those functions to remain on page 1, I would then export page 2, and import it to page 1 ... 
and then delete page 2\r\n\r\nBoth of these options are viable, but don't seem very efficient. The efficiency lessens if the next empty page is further away from the page you want to edit. \r\n\r\nHaving a swap function so that I can click on \"Swap\" -> button _1.1_ -> button _1.12_ and have them change places would be time-saving.\r\n\r\nThank you for your consideration of this feature.","html":"

Swap Buttons in GUI

\n\n

Describe the feature\nA button in the Companion Admin GUI that would allow two already-programmed functions/buttons to swap places with each other, rather than moving one button and reprogramming the other.

\n\n

Is this platform dependent (windows, mac, ..)?\nNo

\n\n

If documentation is required to implement, do you know where to find it?\nNo, I don't. I do know that there is a similar function in the guts of MA software. When you \"move\" an executor to a fader or button where there is already an executor programmed, the one in the destination moves to where the first executor was originally.

\n\n

Usecases\nIf I have a full Companion page programmed, but decide to change the layout, there are a few options open to me. For the examples, I will say that I want to swap buttons 1.1 and 1.12

\n\n

1) I can move/copy 1.1 to 2.1, then move 1.12 to 1.1, then move 2.1 to 1.12

\n\n

2) I can export page 1, then import it to page 2. I can then copy 1.1 to 2.12, and copy 1.12 to 2.1. If I wanted those functions to remain on page 1, I would then export page 2, and import it to page 1 ... and then delete page 2

\n\n

Both of these options are viable, but don't seem very efficient. The efficiency lessens if the next empty page is further away from the page you want to edit.

\n\n

Having a swap function so that I can click on \"Swap\" -> button 1.1 -> button 1.12 and have them change places would be time-saving.

\n\n

Thank you for your consideration of this feature.

\n","meta":{"source":"GitHub","url":"https://github.com/bitfocus/companion/issues/781"},"_input_hash":-1213161693,"_task_hash":638542890,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Update ConfigMap Docs","meta":{"source":"GitHub","url":"https://github.com/kubernetes/kubernetes.github.io/issues/4509"},"label":"DOCUMENTATION","_input_hash":1393034688,"_task_hash":637065104,"answer":"accept"} {"text":"Samples erroring when loaded locally","meta":{"source":"GitHub","url":"https://github.com/beakable/isometric/issues/20"},"label":"DOCUMENTATION","_input_hash":599938087,"_task_hash":-1086197501,"answer":"reject"} {"text":"Add handlebars support to HTML previews","meta":{"source":"GitHub","url":"https://github.com/hal/hal.next/issues/107"},"label":"DOCUMENTATION","_input_hash":63668977,"_task_hash":-1011493120,"answer":"reject"} {"text":"# Continuous deployment\n\nOur workflow is looking pretty good! We have now set up some automated tests for any new changes. We can go one step further and add continuous deployment to our workflow.\n\n### What is Continuous Deployment?\n\n**Continuous Deployment**, or **CD**, is an extended step that builds from the automation in CI. CD is automation at various stages, deploying new changes to the different environment.\n\nThe goal of CD is to reduce the time it takes to finish a project. Automation provides shorter feedback loops. This could look like faster testing cycles, or faster deployment and user feedback.\n\nThere are several ways to deploy your code changes. For this repository, we'll deploy with GitHub Pages. If you'd like to learn more about GitHub Pages, there are a [few learning lab courses](https://lab.github.com/courses?tag=GitHub%20Pages) you might be interested in.\n\nWhen deploying with GitHub Pages, you can choose to deploy from several locations. We're going to deploy from the `/docs` directory of this repository. 
This will deploy only the contents of the `/docs` directory.\n\n## Step 12: Deploy\n\nWhenever there is a new commit on `master`, GitHub pages will deploy.\n\n### :keyboard: Activity: Enable GitHub pages to deploy\n\n1. Navigate to the [**Settings**](https://github.com/m-vallance/continuous-integration-circle/settings) tab.\n1. Under GitHub pages, set the source to `master branch` and click **Save**.\n\n
\n

I'll respond below for your next steps.

\n","title":"Continuous deployment","body":"Our workflow is looking pretty good! We have now set up some automated tests for any new changes. We can go one step further and add continuous deployment to our workflow.\n\n### What is Continuous Deployment?\n\n**Continuous Deployment**, or **CD**, is an extended step that builds from the automation in CI. CD is automation at various stages, deploying new changes to the different environment.\n\nThe goal of CD is to reduce the time it takes to finish a project. Automation provides shorter feedback loops. This could look like faster testing cycles, or faster deployment and user feedback.\n\nThere are several ways to deploy your code changes. For this repository, we'll deploy with GitHub Pages. If you'd like to learn more about GitHub Pages, there are a [few learning lab courses](https://lab.github.com/courses?tag=GitHub%20Pages) you might be interested in.\n\nWhen deploying with GitHub Pages, you can choose to deploy from several locations. We're going to deploy from the `/docs` directory of this repository. This will deploy only the contents of the `/docs` directory.\n\n## Step 12: Deploy\n\nWhenever there is a new commit on `master`, GitHub pages will deploy.\n\n### :keyboard: Activity: Enable GitHub pages to deploy\n\n1. Navigate to the [**Settings**](https://github.com/m-vallance/continuous-integration-circle/settings) tab.\n1. Under GitHub pages, set the source to `master branch` and click **Save**.\n\n
\n

I'll respond below for your next steps.

\n","html":"

Continuous deployment

\n\n

Our workflow is looking pretty good! We have now set up some automated tests for any new changes. We can go one step further and add continuous deployment to our workflow.

\n\n

What is Continuous Deployment?

\n\n

Continuous Deployment, or CD, is an extended step that builds from the automation in CI. CD is automation at various stages, deploying new changes to the different environment.

\n\n

The goal of CD is to reduce the time it takes to finish a project. Automation provides shorter feedback loops. This could look like faster testing cycles, or faster deployment and user feedback.

\n\n

There are several ways to deploy your code changes. For this repository, we'll deploy with GitHub Pages. If you'd like to learn more about GitHub Pages, there are a few learning lab courses you might be interested in.

\n\n

When deploying with GitHub Pages, you can choose to deploy from several locations. We're going to deploy from the /docs directory of this repository. This will deploy only the contents of the /docs directory.

\n\n

Step 12: Deploy

\n\n

Whenever there is a new commit on master, GitHub pages will deploy.

\n\n

:keyboard: Activity: Enable GitHub pages to deploy

\n\n
    \n
  1. Navigate to the Settings tab.
  2. \n
  3. Under GitHub pages, set the source to master branch and click Save.
  4. \n
\n\n
\n\n

I'll respond below for your next steps.

\n","meta":{"source":"GitHub","url":"https://github.com/m-vallance/continuous-integration-circle/issues/6"},"_input_hash":-422421118,"_task_hash":1524573671,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Starting documentation","meta":{"source":"GitHub","url":"https://github.com/alexandruchircu/lp-lo/issues/1"},"label":"DOCUMENTATION","_input_hash":1809455977,"_task_hash":-1251401312,"answer":"accept"} {"text":"Create a folder","meta":{"source":"GitHub","url":"https://github.com/ChaoticFuzz/Skill-Fish/issues/3"},"label":"DOCUMENTATION","_input_hash":1864438105,"_task_hash":-201721907,"answer":"reject"} {"text":"HTML page examples","meta":{"source":"GitHub","url":"https://github.com/webtorrent/webtorrent/issues/1165"},"label":"DOCUMENTATION","_input_hash":576735964,"_task_hash":782793983,"answer":"accept"} {"text":"# Building to Subdirectory\n\nHi team! Great job on this project, it's awesome.\r\n\r\nI'm trying to build to a sub directory, so for example:\r\n```\r\nhttps://example.com/docs-site-1\r\n```\r\n\r\nI'm currently using the `baseURL` key in the `.gitdocs.json` file, and that is compiling the webpages in a `docs-site-1` directory fine. However the main bundle and static assets are outside of this directory in the `.gitdocs_build` directory. Is there any way to change this behavior to have it all compiled in the root `.gitdocs_build` directory with the anchor tags pointing to `/docs-site-1/index.html`?\r\n\r\nThanks for any input!","title":"Building to Subdirectory","body":"Hi team! Great job on this project, it's awesome.\r\n\r\nI'm trying to build to a sub directory, so for example:\r\n```\r\nhttps://example.com/docs-site-1\r\n```\r\n\r\nI'm currently using the `baseURL` key in the `.gitdocs.json` file, and that is compiling the webpages in a `docs-site-1` directory fine. However the main bundle and static assets are outside of this directory in the `.gitdocs_build` directory. 
Is there any way to change this behavior to have it all compiled in the root `.gitdocs_build` directory with the anchor tags pointing to `/docs-site-1/index.html`?\r\n\r\nThanks for any input!","html":"

Building to Subdirectory

\n\n

Hi team! Great job on this project, it's awesome.

\n\n

I'm trying to build to a sub directory, so for example:\n\nhttps://example.com/docs-site-1\n

\n\n

I'm currently using the baseURL key in the .gitdocs.json file, and that is compiling the webpages in a docs-site-1 directory fine. However the main bundle and static assets are outside of this directory in the .gitdocs_build directory. Is there any way to change this behavior to have it all compiled in the root .gitdocs_build directory with the anchor tags pointing to /docs-site-1/index.html?

\n\n

Thanks for any input!

\n","meta":{"source":"GitHub","url":"https://github.com/timberio/gitdocs/issues/171"},"_input_hash":1724424341,"_task_hash":2001424724,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Working AndroidTestOrchestrator example","meta":{"source":"GitHub","url":"https://github.com/googlesamples/android-testing/issues/141"},"label":"DOCUMENTATION","_input_hash":79840122,"_task_hash":-163384476,"answer":"reject"} {"text":"Start an FAQ section","meta":{"source":"GitHub","url":"https://github.com/biosustain/memote/issues/136"},"label":"DOCUMENTATION","_input_hash":198646299,"_task_hash":1439064344,"answer":"accept"} {"text":"Add me please","meta":{"source":"GitHub","url":"https://github.com/githubschool/open-enrollment-classes-introduction-to-github/issues/8998"},"label":"DOCUMENTATION","_input_hash":-1572892862,"_task_hash":-1735421486,"answer":"reject"} {"text":"swagger-document compose - FAILED","meta":{"source":"GitHub","url":"https://github.com/Azure/autorest/issues/2473"},"label":"DOCUMENTATION","_input_hash":-228214282,"_task_hash":1442471523,"answer":"reject"} {"text":"Update Documentation for 1.0.1b","meta":{"source":"GitHub","url":"https://github.com/FCP-INDI/C-PAC/issues/679"},"label":"DOCUMENTATION","_input_hash":-867708390,"_task_hash":-1447951012,"answer":"accept"} {"text":"# Finish Writing Readme\n\n","title":"Finish Writing Readme","body":"","html":"

Finish Writing Readme

\n","meta":{"source":"GitHub","url":"https://github.com/tzulungs/Box_Office_Success/issues/1"},"_input_hash":-1402604640,"_task_hash":-1989640423,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"# Make data importer\n\nWrite function to Import json files into python. Might consider using nltools.data.Adjacency class for graphs over time.\r\n\r\nSee here for getting data off of chips using reader. https://github.com/meriac/openbeacon-ng/blob/master/docs/quickstart.md","title":"Make data importer","body":"Write function to Import json files into python. Might consider using nltools.data.Adjacency class for graphs over time.\r\n\r\nSee here for getting data off of chips using reader. https://github.com/meriac/openbeacon-ng/blob/master/docs/quickstart.md","html":"

Make data importer

\n\n

Write function to Import json files into python. Might consider using nltools.data.Adjacency class for graphs over time.

\n\n

See here for getting data off of chips using reader. https://github.com/meriac/openbeacon-ng/blob/master/docs/quickstart.md

\n","meta":{"source":"GitHub","url":"https://github.com/ljchang/openbeacon/issues/1"},"_input_hash":2059198928,"_task_hash":627286934,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# How do you create your own remote collection?\n\nThis sounds like very basic question, but I tried and could not find an answer from my own searching.\r\n\r\nhttps://docs.bit.dev/docs/cli-remote.html implies you can create your own collection and add it as your remote.\r\nhttps://github.com/teambit/bit-docker seems the one I can use it for that purpose.\r\nhttps://github.com/teambit/bit says \"You can set up a collection on any server, or use Bit\u2019s component hub.\"\r\n\r\nBut I have hard time finding exactly how to set up a collection on any server. All documentation I find leads to using bit.dev. Is there a way for me to start this?\r\n\r\nThank you!","title":"How do you create your own remote collection?","body":"This sounds like very basic question, but I tried and could not find an answer from my own searching.\r\n\r\nhttps://docs.bit.dev/docs/cli-remote.html implies you can create your own collection and add it as your remote.\r\nhttps://github.com/teambit/bit-docker seems the one I can use it for that purpose.\r\nhttps://github.com/teambit/bit says \"You can set up a collection on any server, or use Bit\u2019s component hub.\"\r\n\r\nBut I have hard time finding exactly how to set up a collection on any server. All documentation I find leads to using bit.dev. Is there a way for me to start this?\r\n\r\nThank you!","html":"

How do you create your own remote collection?

\n\n

This sounds like very basic question, but I tried and could not find an answer from my own searching.

\n\n

https://docs.bit.dev/docs/cli-remote.html implies you can create your own collection and add it as your remote.\nhttps://github.com/teambit/bit-docker seems the one I can use it for that purpose.\nhttps://github.com/teambit/bit says \"You can set up a collection on any server, or use Bit\u2019s component hub.\"

\n\n

But I have hard time finding exactly how to set up a collection on any server. All documentation I find leads to using bit.dev. Is there a way for me to start this?

\n\n

Thank you!

\n","meta":{"source":"GitHub","url":"https://github.com/teambit/bit/issues/1915"},"_input_hash":-852978005,"_task_hash":-329366398,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Re-open Camera","meta":{"source":"GitHub","url":"https://github.com/TarasOsiris/android-goodies-docs-PRO/issues/2"},"label":"DOCUMENTATION","_input_hash":843668706,"_task_hash":-1224511339,"answer":"reject"} {"text":"lint markdown for style","meta":{"source":"GitHub","url":"https://github.com/coreos/etcd/issues/8310"},"label":"DOCUMENTATION","_input_hash":2008340927,"_task_hash":1118884010,"answer":"reject"} {"text":"Nodemailer not working :(","meta":{"source":"GitHub","url":"https://github.com/nodejs/help/issues/755"},"label":"DOCUMENTATION","_input_hash":1759610411,"_task_hash":246186193,"answer":"reject"} {"text":"Give full schema in docs?","meta":{"source":"GitHub","url":"https://github.com/openml/website/issues/163"},"label":"DOCUMENTATION","_input_hash":-549113260,"_task_hash":-315574834,"answer":"accept"} {"text":"should update teracy-dev development guide","meta":{"source":"GitHub","url":"https://github.com/teracyhq/dev/issues/379"},"label":"DOCUMENTATION","_input_hash":-1272747097,"_task_hash":-521082194,"answer":"accept"} {"text":"Distribute documentation mis-formatted","meta":{"source":"GitHub","url":"https://github.com/pallets/flask/issues/2432"},"label":"DOCUMENTATION","_input_hash":1916231742,"_task_hash":1607688309,"answer":"accept"} {"text":"SocketException[Connected refused] errors","meta":{"source":"GitHub","url":"https://github.com/hazelcast/hazelcast-aws/issues/34"},"label":"DOCUMENTATION","_input_hash":151960383,"_task_hash":1758832543,"answer":"reject"} {"text":"A way to trigger appear animation using Transition","meta":{"source":"GitHub","url":"https://github.com/reactjs/react-transition-group/issues/133"},"label":"DOCUMENTATION","_input_hash":-1025812159,"_task_hash":1147581970,"answer":"reject"} {"text":"Development Environment: ValueError: 
Protocol message has no non-repeated submessage field \"metadata\"","meta":{"source":"GitHub","url":"https://github.com/tensorflow/tensorboard/issues/271"},"label":"DOCUMENTATION","_input_hash":-1944755051,"_task_hash":-755616676,"answer":"reject"} {"text":"Embed README images","meta":{"source":"GitHub","url":"https://github.com/jtof-fap/verifHostname/issues/1"},"label":"DOCUMENTATION","_input_hash":1585745516,"_task_hash":517171608,"answer":"accept"} {"text":"Design \"resting state\" instructions","meta":{"source":"GitHub","url":"https://github.com/beefoo/climate-lab/issues/20"},"label":"DOCUMENTATION","_input_hash":-1021794729,"_task_hash":623337362,"answer":"accept"} {"text":"Multiple recipients vs. git signing key","meta":{"source":"GitHub","url":"https://github.com/justwatchcom/gopass/issues/224"},"label":"DOCUMENTATION","_input_hash":-277188293,"_task_hash":643886324,"answer":"reject"} {"text":"Which licence is this issued under?","meta":{"source":"GitHub","url":"https://github.com/better-js-logging/angular-logger2/issues/3"},"label":"DOCUMENTATION","_input_hash":-1766614987,"_task_hash":-856987665,"answer":"accept"} {"text":"The aria-labels of checkboxes in the Select component appears as [object Object] if the label is a node.","meta":{"source":"GitHub","url":"https://github.com/grommet/grommet/issues/1514"},"label":"DOCUMENTATION","_input_hash":-161381395,"_task_hash":-1596634573,"answer":"reject"} {"text":"Document git workflow","meta":{"source":"GitHub","url":"https://github.com/mineral-ui/mineral-ui/issues/205"},"label":"DOCUMENTATION","_input_hash":271329105,"_task_hash":202815863,"answer":"accept"} {"text":"Make switching to CuDNN easy","meta":{"source":"GitHub","url":"https://github.com/deeplearning4j/dl4j-examples/issues/493"},"label":"DOCUMENTATION","_input_hash":-1324785194,"_task_hash":-1312965738,"answer":"reject"} {"text":"mvn -Pconfigure-datasource fails to 
build","meta":{"source":"GitHub","url":"https://github.com/52North/sos/issues/560"},"label":"DOCUMENTATION","_input_hash":-1877785785,"_task_hash":-1953484013,"answer":"reject"} {"text":"Issue when using function convertUTCtoTT","meta":{"source":"GitHub","url":"https://github.com/Tudat/tudat/issues/224"},"label":"DOCUMENTATION","_input_hash":-1547340774,"_task_hash":177361406,"answer":"reject"} {"text":"node.exe via WSL fails with EINVAL on uv_pipe_open","meta":{"source":"GitHub","url":"https://github.com/Microsoft/BashOnWindows/issues/2370"},"label":"DOCUMENTATION","_input_hash":193143769,"_task_hash":1251083294,"answer":"reject"} {"text":"Error establishing a database connection","meta":{"source":"GitHub","url":"https://github.com/drlogout/wordpress-duplicator/issues/2"},"label":"DOCUMENTATION","_input_hash":1843446970,"_task_hash":-676457483,"answer":"reject"} {"text":"Swift proto file extension values empty, data showing up in unknown storage","meta":{"source":"GitHub","url":"https://github.com/apple/swift-protobuf/issues/622"},"label":"DOCUMENTATION","_input_hash":-793562744,"_task_hash":-250728069,"answer":"reject"} {"text":"Model name has no instructions","meta":{"source":"GitHub","url":"https://github.com/juju/juju-gui/issues/3099"},"label":"DOCUMENTATION","_input_hash":1361160867,"_task_hash":1274065697,"answer":"reject"} {"text":"Saving empty object to a subschema does not do anything in newest mongoose","meta":{"source":"GitHub","url":"https://github.com/Automattic/mongoose/issues/5506"},"label":"DOCUMENTATION","_input_hash":-1273008511,"_task_hash":1257078500,"answer":"reject"} {"text":"Request - please add an option to read the movie ID at the front of the filename","meta":{"source":"GitHub","url":"https://github.com/DoctorD1501/JAVMovieScraper/issues/194"},"label":"DOCUMENTATION","_input_hash":-1797672568,"_task_hash":882660884,"answer":"reject"} {"text":"# Issue with k8s.io/docs/concepts/workloads/pods/pod-lifecycle/\n\n\r\n\r\n\r\n\r\nI am not sure 
if this is a bug. Please update documentation on conditionType for readinessGate.\r\n\r\n\r\n**Problem:**\r\nThe readiness gate explanation for conditionType is not clear as to what does conditionType mean, it just says that conditionType: \"www.example.com/feature-1\", is that a hard coded value, and how does this work.\r\n\r\nThe readiness gate requires one or two lines explanation on conditionType as to how the value can be used or changed.\r\n\r\n**Page to Update:**\r\nhttps://kubernetes.io/docs/concepts/workloads/pods/pod-lifecycle/\r\n\r\n\r\n\r\n1.14\r\n\r\n","title":"Issue with k8s.io/docs/concepts/workloads/pods/pod-lifecycle/","body":"\r\n\r\n\r\n\r\nI am not sure if this is a bug. Please update documentation on conditionType for readinessGate.\r\n\r\n\r\n**Problem:**\r\nThe readiness gate explanation for conditionType is not clear as to what does conditionType mean, it just says that conditionType: \"www.example.com/feature-1\", is that a hard coded value, and how does this work.\r\n\r\nThe readiness gate requires one or two lines explanation on conditionType as to how the value can be used or changed.\r\n\r\n**Page to Update:**\r\nhttps://kubernetes.io/docs/concepts/workloads/pods/pod-lifecycle/\r\n\r\n\r\n\r\n1.14\r\n\r\n","html":"

Issue with k8s.io/docs/concepts/workloads/pods/pod-lifecycle/

\n\n

\n\nI am not sure if this is a bug. Please update documentation on conditionType for readinessGate.

\n\n

\nProblem:\nThe readiness gate explanation for conditionType is not clear as to what does conditionType mean, it just says that conditionType: \"www.example.com/feature-1\", is that a hard coded value, and how does this work.

\n\n

The readiness gate requires one or two lines explanation on conditionType as to how the value can be used or changed.

\n\n

Page to Update:\nhttps://kubernetes.io/docs/concepts/workloads/pods/pod-lifecycle/

\n\n

\n\n1.14\n

\n","meta":{"source":"GitHub","url":"https://github.com/kubernetes/website/issues/15797"},"_input_hash":-770062058,"_task_hash":-1541096236,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"automate location translation file updates","meta":{"source":"GitHub","url":"https://github.com/CARLI/vufind/issues/241"},"label":"DOCUMENTATION","_input_hash":1877654670,"_task_hash":647477559,"answer":"reject"} {"text":"# (appveyor-server) Update documentation\n\nAs requested in https://github.com/appveyor/website/pull/645 I have created this issue to update the documentation and title for `appveyor-server`","title":"(appveyor-server) Update documentation","body":"As requested in https://github.com/appveyor/website/pull/645 I have created this issue to update the documentation and title for `appveyor-server`","html":"

(appveyor-server) Update documentation

\n\n

As requested in https://github.com/appveyor/website/pull/645 I have created this issue to update the documentation and title for appveyor-server

\n","meta":{"source":"GitHub","url":"https://github.com/mkevenaar/chocolatey-packages/issues/27"},"_input_hash":1346851678,"_task_hash":-1815745975,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"Triggers and actions not working ","meta":{"source":"GitHub","url":"https://github.com/3Blades/3blades/issues/106"},"label":"DOCUMENTATION","_input_hash":744528473,"_task_hash":119395370,"answer":"reject"} {"text":"Trying to add own leds that are connected","meta":{"source":"GitHub","url":"https://github.com/mariusmotea/diyHue/issues/28"},"label":"DOCUMENTATION","_input_hash":717216390,"_task_hash":1271956319,"answer":"reject"} {"text":"date format in pluck() for collection","meta":{"source":"GitHub","url":"https://github.com/tightenco/jigsaw/issues/127"},"label":"DOCUMENTATION","_input_hash":-1070661473,"_task_hash":848420477,"answer":"reject"} {"text":"Add -ResourceId parameter for all Get-AzureRm* commands","meta":{"source":"GitHub","url":"https://github.com/Azure/azure-powershell/issues/4366"},"label":"DOCUMENTATION","_input_hash":122219843,"_task_hash":-325348631,"answer":"reject"} {"text":"# Game does not compile on Linux out of the box for several reasons\n\nFirst of all, you hardcode \"g++-5\" in game/sdl/linux/makefile\r\nTo continue compilation I had to remove -5 from several places on top of the make file so that executable is just \"g++\".\r\n\r\nThen build has stopped because of missing #include in several files:\r\ngame/httppackinfomanager.cpp with error: 'fopen' was not declared in this scope \r\ngame/httppackmanager.cpp:54:51: error: 'fopen' was not declared in this scope\r\nmpshared/packinfomanager.cpp:43:54: error: 'fopen' was not declared in this scope\r\nmpshared/indexloader.cpp:12:38: error: 'fopen' was not declared in this scope\r\n\r\nNext problem was:\r\n\r\n../../gameform.cpp:4:30: fatal error: game/Multiplayer.h: No such file or directory\r\ncompilation terminated.\r\n../../Multiplayer.cpp:2:30: fatal error: 
game/Multiplayer.h: No such file or directory\r\ncompilation terminated.\r\nIn fact this file does not exist, however file game/multiplayer.h does. So changing this two files to multiplayer.h did work.\r\n\r\nAfter those five files the game compile successfully.\r\n\r\nAnother problem is with file game/sdl/linux/install.sh\r\nIt uses ``if ! dpkg -l | grep libsdl2-dev > /dev/null ; then``\r\nThis line which would only work in distributions based on deb packages, such as Debian and Ubuntu but not in other distributions. I think it's better to say that SDL2 is required for building and recommend to install this two packages: libsdl2-dev libcurl4-openssl-dev in your readme file\r\n\r\n(though in my distribution -dev packages are not separated from main packages)","title":"Game does not compile on Linux out of the box for several reasons","body":"First of all, you hardcode \"g++-5\" in game/sdl/linux/makefile\r\nTo continue compilation I had to remove -5 from several places on top of the make file so that executable is just \"g++\".\r\n\r\nThen build has stopped because of missing #include in several files:\r\ngame/httppackinfomanager.cpp with error: 'fopen' was not declared in this scope \r\ngame/httppackmanager.cpp:54:51: error: 'fopen' was not declared in this scope\r\nmpshared/packinfomanager.cpp:43:54: error: 'fopen' was not declared in this scope\r\nmpshared/indexloader.cpp:12:38: error: 'fopen' was not declared in this scope\r\n\r\nNext problem was:\r\n\r\n../../gameform.cpp:4:30: fatal error: game/Multiplayer.h: No such file or directory\r\ncompilation terminated.\r\n../../Multiplayer.cpp:2:30: fatal error: game/Multiplayer.h: No such file or directory\r\ncompilation terminated.\r\nIn fact this file does not exist, however file game/multiplayer.h does. So changing this two files to multiplayer.h did work.\r\n\r\nAfter those five files the game compile successfully.\r\n\r\nAnother problem is with file game/sdl/linux/install.sh\r\nIt uses ``if ! 
dpkg -l | grep libsdl2-dev > /dev/null ; then``\r\nThis line which would only work in distributions based on deb packages, such as Debian and Ubuntu but not in other distributions. I think it's better to say that SDL2 is required for building and recommend to install this two packages: libsdl2-dev libcurl4-openssl-dev in your readme file\r\n\r\n(though in my distribution -dev packages are not separated from main packages)","html":"

Game does not compile on Linux out of the box for several reasons

\n\n

First of all, you hardcode \"g++-5\" in game/sdl/linux/makefile\nTo continue compilation I had to remove -5 from several places on top of the make file so that executable is just \"g++\".

\n\n

Then build has stopped because of missing #include in several files:\ngame/httppackinfomanager.cpp with error: 'fopen' was not declared in this scope \ngame/httppackmanager.cpp:54:51: error: 'fopen' was not declared in this scope\nmpshared/packinfomanager.cpp:43:54: error: 'fopen' was not declared in this scope\nmpshared/indexloader.cpp:12:38: error: 'fopen' was not declared in this scope

\n\n

Next problem was:

\n\n

../../gameform.cpp:4:30: fatal error: game/Multiplayer.h: No such file or directory\ncompilation terminated.\n../../Multiplayer.cpp:2:30: fatal error: game/Multiplayer.h: No such file or directory\ncompilation terminated.\nIn fact this file does not exist, however file game/multiplayer.h does. So changing this two files to multiplayer.h did work.

\n\n

After those five files the game compile successfully.

\n\n

Another problem is with file game/sdl/linux/install.sh\nIt uses if ! dpkg -l | grep libsdl2-dev > /dev/null ; then\nThis line which would only work in distributions based on deb packages, such as Debian and Ubuntu but not in other distributions. I think it's better to say that SDL2 is required for building and recommend to install this two packages: libsdl2-dev libcurl4-openssl-dev in your readme file

\n\n

(though in my distribution -dev packages are not separated from main packages)

\n","meta":{"source":"GitHub","url":"https://github.com/spiffcode/hostile-takeover/issues/4"},"_input_hash":1202045005,"_task_hash":730119548,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# \u0421\u0434\u0435\u043b\u0430\u043b \u0432\u0441\u0451 \u043f\u043e\u0448\u0430\u0433\u043e\u0432\u043e, \u043d\u043e \u0432 /movies - \u043e\u0448\u0438\u0431\u043a\u0430 404\n\n\u0423\u0436\u0435 2 \u0440\u0430\u0437 \u043f\u0435\u0440\u0435\u0441\u043e\u0437\u0434\u0430\u044e \u043f\u0440\u043e\u0435\u043a\u0442, \u0432\u044b\u043f\u043e\u043b\u043d\u0438\u043b \u0432\u0441\u0435 \u0443\u043a\u0430\u0437\u0430\u043d\u043d\u044b\u0435 \u043a\u043e\u043c\u0430\u043d\u0434\u044b ( \u0441 \u043c\u0438\u0433\u0440\u0430\u0446\u0438\u0435\u0439 \u0438 \u0430\u043f\u0434\u0435\u0439\u0442\u043e\u043c), \u043d\u043e /movies \u0440\u0430\u0437\u0434\u0435\u043b\u0430 \u043d\u0435 \u0441\u0443\u0449\u0435\u0441\u0442\u0432\u0443\u0435\u0442. \r\n\r\n---\r\n#### \u0421\u0432\u0435\u0434\u0435\u043d\u0438\u044f \u043e \u0434\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0435\r\n\r\n\u26a0 *\u041d\u0435 \u0432\u043d\u043e\u0441\u0438\u0442\u0435 \u043f\u0440\u0430\u0432\u043a\u0438 \u0432 \u044d\u0442\u043e\u0442 \u0440\u0430\u0437\u0434\u0435\u043b. 
\u042d\u0442\u043e \u043d\u0435\u043e\u0431\u0445\u043e\u0434\u0438\u043c\u043e \u0434\u043b\u044f \u0441\u0432\u044f\u0437\u044b\u0432\u0430\u043d\u0438\u044f \u0441\u0442\u0440\u0430\u043d\u0438\u0446\u044b \u0441\u0430\u0439\u0442\u0430 docs.microsoft.com \u0441 \u0432\u043e\u043f\u0440\u043e\u0441\u043e\u043c \u043d\u0430 GitHub.*\r\n\r\n* ID: 6719f08e-3bd7-dc1a-71df-f2ef9fbca9d8\r\n* Version Independent ID: 7096fdb3-612e-9e00-bd0b-8ea4886a09ce\r\n* Content: [\u0414\u043e\u0431\u0430\u0432\u043b\u0435\u043d\u0438\u0435 \u043c\u043e\u0434\u0435\u043b\u0438 \u0432 \u043f\u0440\u0438\u043b\u043e\u0436\u0435\u043d\u0438\u0435 Razor Pages \u0432 ASP.NET Core](https://docs.microsoft.com/ru-ru/aspnet/core/tutorials/razor-pages/model?view=aspnetcore-2.2&tabs=visual-studio#feedback)\r\n* Content Source: [aspnetcore/tutorials/razor-pages/model.md](https://github.com/aspnet/AspNetCore.Docs.ru-ru/blob/live/aspnetcore/tutorials/razor-pages/model.md)\r\n* Product: **aspnet-core**\r\n* Technology: **aspnetcore-tutorials**\r\n* GitHub Login: @Rick-Anderson\r\n* Microsoft Alias: **riande**","title":"\u0421\u0434\u0435\u043b\u0430\u043b \u0432\u0441\u0451 \u043f\u043e\u0448\u0430\u0433\u043e\u0432\u043e, \u043d\u043e \u0432 /movies - \u043e\u0448\u0438\u0431\u043a\u0430 404","body":"\u0423\u0436\u0435 2 \u0440\u0430\u0437 \u043f\u0435\u0440\u0435\u0441\u043e\u0437\u0434\u0430\u044e \u043f\u0440\u043e\u0435\u043a\u0442, \u0432\u044b\u043f\u043e\u043b\u043d\u0438\u043b \u0432\u0441\u0435 \u0443\u043a\u0430\u0437\u0430\u043d\u043d\u044b\u0435 \u043a\u043e\u043c\u0430\u043d\u0434\u044b ( \u0441 \u043c\u0438\u0433\u0440\u0430\u0446\u0438\u0435\u0439 \u0438 \u0430\u043f\u0434\u0435\u0439\u0442\u043e\u043c), \u043d\u043e /movies \u0440\u0430\u0437\u0434\u0435\u043b\u0430 \u043d\u0435 \u0441\u0443\u0449\u0435\u0441\u0442\u0432\u0443\u0435\u0442. 
\r\n\r\n---\r\n#### \u0421\u0432\u0435\u0434\u0435\u043d\u0438\u044f \u043e \u0434\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0435\r\n\r\n\u26a0 *\u041d\u0435 \u0432\u043d\u043e\u0441\u0438\u0442\u0435 \u043f\u0440\u0430\u0432\u043a\u0438 \u0432 \u044d\u0442\u043e\u0442 \u0440\u0430\u0437\u0434\u0435\u043b. \u042d\u0442\u043e \u043d\u0435\u043e\u0431\u0445\u043e\u0434\u0438\u043c\u043e \u0434\u043b\u044f \u0441\u0432\u044f\u0437\u044b\u0432\u0430\u043d\u0438\u044f \u0441\u0442\u0440\u0430\u043d\u0438\u0446\u044b \u0441\u0430\u0439\u0442\u0430 docs.microsoft.com \u0441 \u0432\u043e\u043f\u0440\u043e\u0441\u043e\u043c \u043d\u0430 GitHub.*\r\n\r\n* ID: 6719f08e-3bd7-dc1a-71df-f2ef9fbca9d8\r\n* Version Independent ID: 7096fdb3-612e-9e00-bd0b-8ea4886a09ce\r\n* Content: [\u0414\u043e\u0431\u0430\u0432\u043b\u0435\u043d\u0438\u0435 \u043c\u043e\u0434\u0435\u043b\u0438 \u0432 \u043f\u0440\u0438\u043b\u043e\u0436\u0435\u043d\u0438\u0435 Razor Pages \u0432 ASP.NET Core](https://docs.microsoft.com/ru-ru/aspnet/core/tutorials/razor-pages/model?view=aspnetcore-2.2&tabs=visual-studio#feedback)\r\n* Content Source: [aspnetcore/tutorials/razor-pages/model.md](https://github.com/aspnet/AspNetCore.Docs.ru-ru/blob/live/aspnetcore/tutorials/razor-pages/model.md)\r\n* Product: **aspnet-core**\r\n* Technology: **aspnetcore-tutorials**\r\n* GitHub Login: @Rick-Anderson\r\n* Microsoft Alias: **riande**","html":"

\u0421\u0434\u0435\u043b\u0430\u043b \u0432\u0441\u0451 \u043f\u043e\u0448\u0430\u0433\u043e\u0432\u043e, \u043d\u043e \u0432 /movies - \u043e\u0448\u0438\u0431\u043a\u0430 404

\n\n

\u0423\u0436\u0435 2 \u0440\u0430\u0437 \u043f\u0435\u0440\u0435\u0441\u043e\u0437\u0434\u0430\u044e \u043f\u0440\u043e\u0435\u043a\u0442, \u0432\u044b\u043f\u043e\u043b\u043d\u0438\u043b \u0432\u0441\u0435 \u0443\u043a\u0430\u0437\u0430\u043d\u043d\u044b\u0435 \u043a\u043e\u043c\u0430\u043d\u0434\u044b ( \u0441 \u043c\u0438\u0433\u0440\u0430\u0446\u0438\u0435\u0439 \u0438 \u0430\u043f\u0434\u0435\u0439\u0442\u043e\u043c), \u043d\u043e /movies \u0440\u0430\u0437\u0434\u0435\u043b\u0430 \u043d\u0435 \u0441\u0443\u0449\u0435\u0441\u0442\u0432\u0443\u0435\u0442.

\n\n
\n\n

\u0421\u0432\u0435\u0434\u0435\u043d\u0438\u044f \u043e \u0434\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0435

\n\n

\u26a0 \u041d\u0435 \u0432\u043d\u043e\u0441\u0438\u0442\u0435 \u043f\u0440\u0430\u0432\u043a\u0438 \u0432 \u044d\u0442\u043e\u0442 \u0440\u0430\u0437\u0434\u0435\u043b. \u042d\u0442\u043e \u043d\u0435\u043e\u0431\u0445\u043e\u0434\u0438\u043c\u043e \u0434\u043b\u044f \u0441\u0432\u044f\u0437\u044b\u0432\u0430\u043d\u0438\u044f \u0441\u0442\u0440\u0430\u043d\u0438\u0446\u044b \u0441\u0430\u0439\u0442\u0430 docs.microsoft.com \u0441 \u0432\u043e\u043f\u0440\u043e\u0441\u043e\u043c \u043d\u0430 GitHub.

\n\n\n","meta":{"source":"GitHub","url":"https://github.com/aspnet/AspNetCore.Docs.ru-ru/issues/55"},"_input_hash":-517472946,"_task_hash":-1848268567,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Custom Easing Logic","meta":{"source":"GitHub","url":"https://github.com/nicky-lenaers/ngx-scroll-to/issues/19"},"label":"DOCUMENTATION","_input_hash":1681282057,"_task_hash":53133300,"answer":"reject"} {"text":"[Docs] How to create custom components?","meta":{"source":"GitHub","url":"https://github.com/Semantic-Org/Semantic-UI/issues/5592"},"label":"DOCUMENTATION","_input_hash":1438123125,"_task_hash":2135359885,"answer":"accept"} {"text":"Update Sysctls Docs","meta":{"source":"GitHub","url":"https://github.com/kubernetes/kubernetes.github.io/issues/4505"},"label":"DOCUMENTATION","_input_hash":-669889638,"_task_hash":-470210411,"answer":"accept"} {"text":"Multidimensional slicing doesn't work as shown","meta":{"source":"GitHub","url":"https://github.com/zackchase/mxnet-the-straight-dope/issues/28"},"label":"DOCUMENTATION","_input_hash":-2141105781,"_task_hash":-1848387253,"answer":"reject"} {"text":"Azure deploy Error: Cannot find module 'user-home'","meta":{"source":"GitHub","url":"https://github.com/nightscout/cgm-remote-monitor/issues/2709"},"label":"DOCUMENTATION","_input_hash":-1789494024,"_task_hash":-294911192,"answer":"reject"} {"text":"Fix: info project in README","meta":{"source":"GitHub","url":"https://github.com/labpositiva/ansible-role-monit/issues/7"},"label":"DOCUMENTATION","_input_hash":-375838941,"_task_hash":960453498,"answer":"accept"} {"text":"Error deploying to Dokku","meta":{"source":"GitHub","url":"https://github.com/Twilio-org/rapid-response-kit/issues/20"},"label":"DOCUMENTATION","_input_hash":1275183832,"_task_hash":1151819020,"answer":"accept"} {"text":"# How to connect to a ES cluster that is TLS/authentication enable?\n\nI can't seem to find any documentation on how to leverage the jaeger-operator to create an 
instance using a secured ES cluster that is TLS and authentication enabled. Can someone provide some guidance on how to get this to work? This is what I have right now. Furthermore, the cleaner job will also need to be able to authenticate. The URL for ES is https://jaeger-es-http:9200 using a self-signed cert.\r\n\r\n```\r\napiVersion: jaegertracing.io/v1\r\nkind: Jaeger\r\nmetadata:\r\n name: jaeger\r\nspec:\r\n strategy: production\r\n storage:\r\n type: elasticsearch\r\n options:\r\n es:\r\n server-urls: http://jaeger-es-http:9200\r\n esIndexCleaner:\r\n enabled: true\r\n numberOfDays: 30\r\n schedule: \"55 23 * * *\"\r\n image: jaegertracing/jaeger-es-index-cleaner\r\n agent:\r\n strategy: DaemonSet\r\n sampling:\r\n options:\r\n default_strategy:\r\n type: const\r\n param: 1\r\n```","title":"How to connect to a ES cluster that is TLS/authentication enable?","body":"I can't seem to find any documentation on how to leverage the jaeger-operator to create an instance using a secured ES cluster that is TLS and authentication enabled. Can someone provide some guidance on how to get this to work? This is what I have right now. Furthermore, the cleaner job will also need to be able to authenticate. The URL for ES is https://jaeger-es-http:9200 using a self-signed cert.\r\n\r\n```\r\napiVersion: jaegertracing.io/v1\r\nkind: Jaeger\r\nmetadata:\r\n name: jaeger\r\nspec:\r\n strategy: production\r\n storage:\r\n type: elasticsearch\r\n options:\r\n es:\r\n server-urls: http://jaeger-es-http:9200\r\n esIndexCleaner:\r\n enabled: true\r\n numberOfDays: 30\r\n schedule: \"55 23 * * *\"\r\n image: jaegertracing/jaeger-es-index-cleaner\r\n agent:\r\n strategy: DaemonSet\r\n sampling:\r\n options:\r\n default_strategy:\r\n type: const\r\n param: 1\r\n```","html":"

How to connect to a ES cluster that is TLS/authentication enable?

\n\n

I can't seem to find any documentation on how to leverage the jaeger-operator to create an instance using a secured ES cluster that is TLS and authentication enabled. Can someone provide some guidance on how to get this to work? This is what I have right now. Furthermore, the cleaner job will also need to be able to authenticate. The URL for ES is https://jaeger-es-http:9200 using a self-signed cert.

\n\n

\napiVersion: jaegertracing.io/v1\nkind: Jaeger\nmetadata:\n name: jaeger\nspec:\n strategy: production\n storage:\n type: elasticsearch\n options:\n es:\n server-urls: http://jaeger-es-http:9200\n esIndexCleaner:\n enabled: true\n numberOfDays: 30\n schedule: \"55 23 * * *\"\n image: jaegertracing/jaeger-es-index-cleaner\n agent:\n strategy: DaemonSet\n sampling:\n options:\n default_strategy:\n type: const\n param: 1\n

\n","meta":{"source":"GitHub","url":"https://github.com/jaegertracing/jaeger-operator/issues/591"},"_input_hash":662935040,"_task_hash":-65017714,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Compiling Problem to IOS","meta":{"source":"GitHub","url":"https://github.com/myflashlab/Firebase-ANE/issues/129"},"label":"DOCUMENTATION","_input_hash":-2144871296,"_task_hash":-771717021,"answer":"reject"} {"text":"Minimum DB User Privileges to run app","meta":{"source":"GitHub","url":"https://github.com/BookStackApp/BookStack/issues/451"},"label":"DOCUMENTATION","_input_hash":371781005,"_task_hash":1897161393,"answer":"reject"} {"text":"GetInstanceAttributeListService","meta":{"source":"GitHub","url":"https://github.com/digitalpetri/ethernet-ip/issues/11"},"label":"DOCUMENTATION","_input_hash":-178801717,"_task_hash":1051821100,"answer":"reject"} {"text":"Serps\\SearchEngine\\Google\\Exception\\InvalidDOMException","meta":{"source":"GitHub","url":"https://github.com/serp-spider/search-engine-google/issues/74"},"label":"DOCUMENTATION","_input_hash":-269864236,"_task_hash":2862800,"answer":"reject"} {"text":"# yolov3-tiny compatibility \n\nI am trying to use the yolov3-tiny weights and config files to test performance. I keep encountering a segmentation fault and no other error to be seen. 
I followed the example code from the pip documentation:\r\n\r\n```\r\nfrom pydarknet import Detector, Image\r\nimport cv2\r\n\r\nnet = Detector(bytes(\"cfg/yolov3-tiny.cfg\", encoding=\"utf-8\"), bytes(\"weights/yolov3-tiny.weights\", encoding=\"utf-8\"), 0, bytes(\"cfg/coco.data\",encoding=\"utf-8\"))\r\n\r\nimg = cv2.imread('humans.jpg')\r\nimg_darknet = Image(img)\r\n\r\nresults = net.detect(img_darknet)\r\n\r\nfor cat, score, bounds in results:\r\n x, y, w, h = bounds\r\n cv2.rectangle(img, (int(x - w / 2), int(y - h / 2)), (int(x + w / 2), int(y + h / 2)), (255, 0, 0), thickness=2)\r\n cv2.putText(img,str(cat.decode(\"utf-8\")),(int(x),int(y)),cv2.FONT_HERSHEY_COMPLEX,1,(255,255,0))\r\n\r\ncv2.imshow(\"output\", img)\r\ncv2.waitKey(0)\r\ncv2.destroyAllWindows();\r\n\r\n```","title":"yolov3-tiny compatibility ","body":"I am trying to use the yolov3-tiny weights and config files to test performance. I keep encountering a segmentation fault and no other error to be seen. I followed the example code from the pip documentation:\r\n\r\n```\r\nfrom pydarknet import Detector, Image\r\nimport cv2\r\n\r\nnet = Detector(bytes(\"cfg/yolov3-tiny.cfg\", encoding=\"utf-8\"), bytes(\"weights/yolov3-tiny.weights\", encoding=\"utf-8\"), 0, bytes(\"cfg/coco.data\",encoding=\"utf-8\"))\r\n\r\nimg = cv2.imread('humans.jpg')\r\nimg_darknet = Image(img)\r\n\r\nresults = net.detect(img_darknet)\r\n\r\nfor cat, score, bounds in results:\r\n x, y, w, h = bounds\r\n cv2.rectangle(img, (int(x - w / 2), int(y - h / 2)), (int(x + w / 2), int(y + h / 2)), (255, 0, 0), thickness=2)\r\n cv2.putText(img,str(cat.decode(\"utf-8\")),(int(x),int(y)),cv2.FONT_HERSHEY_COMPLEX,1,(255,255,0))\r\n\r\ncv2.imshow(\"output\", img)\r\ncv2.waitKey(0)\r\ncv2.destroyAllWindows();\r\n\r\n```","html":"

yolov3-tiny compatibility

\n\n

I am trying to use the yolov3-tiny weights and config files to test performance. I keep encountering a segmentation fault and no other error to be seen. I followed the example code from the pip documentation:

\n\n

```\nfrom pydarknet import Detector, Image\nimport cv2

\n\n

net = Detector(bytes(\"cfg/yolov3-tiny.cfg\", encoding=\"utf-8\"), bytes(\"weights/yolov3-tiny.weights\", encoding=\"utf-8\"), 0, bytes(\"cfg/coco.data\",encoding=\"utf-8\"))

\n\n

img = cv2.imread('humans.jpg')\nimg_darknet = Image(img)

\n\n

results = net.detect(img_darknet)

\n\n

for cat, score, bounds in results:\n x, y, w, h = bounds\n cv2.rectangle(img, (int(x - w / 2), int(y - h / 2)), (int(x + w / 2), int(y + h / 2)), (255, 0, 0), thickness=2)\n cv2.putText(img,str(cat.decode(\"utf-8\")),(int(x),int(y)),cv2.FONT_HERSHEY_COMPLEX,1,(255,255,0))

\n\n

cv2.imshow(\"output\", img)\ncv2.waitKey(0)\ncv2.destroyAllWindows();

\n\n

```

\n","meta":{"source":"GitHub","url":"https://github.com/madhawav/YOLO3-4-Py/issues/95"},"_input_hash":-35444084,"_task_hash":-488821196,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Breakup class to better map to Gitlabs documentation","meta":{"source":"GitHub","url":"https://github.com/pyapi-gitlab/pyapi-gitlab/issues/243"},"label":"DOCUMENTATION","_input_hash":-1504953441,"_task_hash":-570922570,"answer":"accept"} {"text":"GroupProcedure DidFinish observer is not guaranteed to execute before child DidFinish observers","meta":{"source":"GitHub","url":"https://github.com/ProcedureKit/ProcedureKit/issues/778"},"label":"DOCUMENTATION","_input_hash":1464601535,"_task_hash":-1114838771,"answer":"reject"} {"text":"Where to Run the swarm_setup.sh Script","meta":{"source":"GitHub","url":"https://github.com/sendwyre/swarm-mode/issues/1"},"label":"DOCUMENTATION","_input_hash":1684057973,"_task_hash":2050328448,"answer":"reject"} {"text":"https://github.com/daisykd/hello-world/blob/master/README.md","meta":{"source":"GitHub","url":"https://github.com/daisykd/hello-world/issues/2"},"label":"DOCUMENTATION","_input_hash":1675277129,"_task_hash":-1257100098,"answer":"accept"} {"text":"How to run tests","meta":{"source":"GitHub","url":"https://github.com/konstantinstadler/country_converter/issues/11"},"label":"DOCUMENTATION","_input_hash":327554853,"_task_hash":-856470089,"answer":"reject"} {"text":"# UPDATE CRONTAB\n\nI am running a process via crontab once a day , \r\nnow I want to run another process once a day , \r\n\r\nhow do i update my crontab on dokku ?\r\n\r\nroot@AmzBotD:~# dokku run test1 crontab -l\r\nno matching process entry found\r\nno crontab for herokuishuser\r\n\r\n","title":"UPDATE CRONTAB","body":"I am running a process via crontab once a day , \r\nnow I want to run another process once a day , \r\n\r\nhow do i update my crontab on dokku ?\r\n\r\nroot@AmzBotD:~# dokku run test1 crontab -l\r\nno matching process entry found\r\nno 
crontab for herokuishuser\r\n\r\n","html":"

UPDATE CRONTAB

\n\n

I am running a process via crontab once a day , \nnow I want to run another process once a day ,

\n\n

how do i update my crontab on dokku ?

\n\n

root@AmzBotD:~# dokku run test1 crontab -l\nno matching process entry found\nno crontab for herokuishuser

\n","meta":{"source":"GitHub","url":"https://github.com/dokku/dokku/issues/3638"},"_input_hash":663336359,"_task_hash":-296189309,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"dapp not reachable when deploying with docker behind nginx","meta":{"source":"GitHub","url":"https://github.com/paritytech/parity/issues/6154"},"label":"DOCUMENTATION","_input_hash":-189953675,"_task_hash":364639639,"answer":"reject"} {"text":"# Implement an interface for all errors, so they can be redefined by callers\n\nCurrently we return some error messages to end users, like\r\n\r\n- \"flag provided but not defined\"\r\n- \"required flag not set\"\r\n\r\nBut do not provide a way for people to re-define those error messages. We should implement some public interfaces for our errors, and provide documentation on how to implement custom error messages.\r\n\r\nRelated issues / PRs\r\n\r\n- https://github.com/urfave/cli/issues/852\r\n- https://github.com/urfave/cli/pull/656","title":"Implement an interface for all errors, so they can be redefined by callers","body":"Currently we return some error messages to end users, like\r\n\r\n- \"flag provided but not defined\"\r\n- \"required flag not set\"\r\n\r\nBut do not provide a way for people to re-define those error messages. We should implement some public interfaces for our errors, and provide documentation on how to implement custom error messages.\r\n\r\nRelated issues / PRs\r\n\r\n- https://github.com/urfave/cli/issues/852\r\n- https://github.com/urfave/cli/pull/656","html":"

Implement an interface for all errors, so they can be redefined by callers

\n\n

Currently we return some error messages to end users, like

\n\n
    \n
  • \"flag provided but not defined\"
  • \n
  • \"required flag not set\"
  • \n
\n\n

But do not provide a way for people to re-define those error messages. We should implement some public interfaces for our errors, and provide documentation on how to implement custom error messages.

\n\n

Related issues / PRs

\n\n
    \n
  • https://github.com/urfave/cli/issues/852
  • \n
  • https://github.com/urfave/cli/pull/656
  • \n
\n","meta":{"source":"GitHub","url":"https://github.com/urfave/cli/issues/853"},"_input_hash":-921080296,"_task_hash":-1722741088,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Updated Definition","meta":{"source":"GitHub","url":"https://github.com/DataInteroperability/xapi-profiles/issues/220"},"label":"DOCUMENTATION","_input_hash":954069212,"_task_hash":1020622290,"answer":"accept"} {"text":"# Weekly Digest (4 August, 2019 - 11 August, 2019)\n\nHere's the **Weekly Digest** for [*veggiemonk/awesome-docker*](https://github.com/veggiemonk/awesome-docker):\n\n - - - \n# ISSUES\nLast week 2 issues were created.\nOf these, 2 issues have been closed and 0 issues are still open.\n## CLOSED ISSUES\n:heart: #729 [fix Travis CI Build #2400](https://github.com/veggiemonk/awesome-docker/pull/729), by [agebhar1](https://github.com/agebhar1)\n:heart: #728 [Fix a typo](https://github.com/veggiemonk/awesome-docker/pull/728), by [gokaygurcan](https://github.com/gokaygurcan)\n## NOISY ISSUE\n:speaker: #728 [Fix a typo](https://github.com/veggiemonk/awesome-docker/pull/728), by [gokaygurcan](https://github.com/gokaygurcan)\nIt received 2 comments.\n\n - - - \n# PULL REQUESTS\nLast week, 2 pull requests were created, updated or merged.\n## MERGED PULL REQUEST\nLast week, 2 pull requests were merged.\n:purple_heart: #729 [fix Travis CI Build #2400](https://github.com/veggiemonk/awesome-docker/pull/729), by [agebhar1](https://github.com/agebhar1)\n:purple_heart: #728 [Fix a typo](https://github.com/veggiemonk/awesome-docker/pull/728), by [gokaygurcan](https://github.com/gokaygurcan)\n\n - - - \n# COMMITS\nLast week there were 9 commits.\n:hammer_and_wrench: [Automated update repository metadata [skip-ci]](https://github.com/veggiemonk/awesome-docker/commit/a6fca13c53bec4aecb0f335ad2b7ea1961e46cda) by [veggiemonk](https://github.com/veggiemonk)\n:hammer_and_wrench: [Automated update repository metadata 
[skip-ci]](https://github.com/veggiemonk/awesome-docker/commit/9b4ed4060f548d16090fe2bce2ca72bd393847cf) by [veggiemonk](https://github.com/veggiemonk)\n:hammer_and_wrench: [Merge pull request #729 from agebhar1/feature/fix-TravisCI#2400 fix Travis CI Build #2400](https://github.com/veggiemonk/awesome-docker/commit/7c97cd24c3c8c301b661e3109ba78bcde97f3473) by [veggiemonk](https://github.com/veggiemonk)\n:hammer_and_wrench: [remove 'docker-fluentd' since it's not available anymore TravisCI [#2400]: > Issues :-( > > > Links > 1. [L185] 404 https://github.com/kiyoto/docker-fluentd > > Dupes > None \u2713 [#2400] https://travis-ci.org/veggiemonk/awesome-docker/builds/569439029](https://github.com/veggiemonk/awesome-docker/commit/3d43a68699d9da3b8d3c1c776cf11d7abfa34909) by [agebhar1](https://github.com/agebhar1)\n:hammer_and_wrench: [fix '4c0a16c Update README.md'](https://github.com/veggiemonk/awesome-docker/commit/aed14d621b97c53df9c27a1c74dda804fb089c27) by [agebhar1](https://github.com/agebhar1)\n:hammer_and_wrench: [Update README.md](https://github.com/veggiemonk/awesome-docker/commit/4c0a16cf5c29fc6cc309dbc7cbe85a30d6b2568f) by [gokaygurcan](https://github.com/gokaygurcan)\n:hammer_and_wrench: [Automated update repository metadata [skip-ci]](https://github.com/veggiemonk/awesome-docker/commit/45fd3df19eb58227bc8275f6bf6846092dc80d0b) by [veggiemonk](https://github.com/veggiemonk)\n:hammer_and_wrench: [Automated update repository metadata [skip-ci]](https://github.com/veggiemonk/awesome-docker/commit/7ce07beb6f94929c5dfdc448699adb6cea32a653) by [veggiemonk](https://github.com/veggiemonk)\n:hammer_and_wrench: [Fix mesosphere renamed d2iq](https://github.com/veggiemonk/awesome-docker/commit/70fe28b14adbebc35705b2f505cd66970384cffe) by [veggiemonk](https://github.com/veggiemonk)\n\n - - - \n# CONTRIBUTORS\nLast week there were 3 contributors.\n:bust_in_silhouette: [veggiemonk](https://github.com/veggiemonk)\n:bust_in_silhouette: 
[agebhar1](https://github.com/agebhar1)\n:bust_in_silhouette: [gokaygurcan](https://github.com/gokaygurcan)\n\n - - - \n# STARGAZERS\nLast week there were 67 stagazers.\n:star: [tbw-wb](https://github.com/tbw-wb)\n:star: [ayalcin1](https://github.com/ayalcin1)\n:star: [sssxie](https://github.com/sssxie)\n:star: [shimadama](https://github.com/shimadama)\n:star: [charstnut](https://github.com/charstnut)\n:star: [nienjiuntai](https://github.com/nienjiuntai)\n:star: [ashwamegh](https://github.com/ashwamegh)\n:star: [oshou](https://github.com/oshou)\n:star: [wlisrausr](https://github.com/wlisrausr)\n:star: [leaked](https://github.com/leaked)\n:star: [paraparity](https://github.com/paraparity)\n:star: [metalmandalore](https://github.com/metalmandalore)\n:star: [sebnapi](https://github.com/sebnapi)\n:star: [CompilerBian](https://github.com/CompilerBian)\n:star: [Kuri-su](https://github.com/Kuri-su)\n:star: [dionysisk](https://github.com/dionysisk)\n:star: [Raltay](https://github.com/Raltay)\n:star: [balloontmz](https://github.com/balloontmz)\n:star: [jairofloress](https://github.com/jairofloress)\n:star: [pseegaha](https://github.com/pseegaha)\n:star: [mvpvg](https://github.com/mvpvg)\n:star: [danielvelara](https://github.com/danielvelara)\n:star: [Fofade](https://github.com/Fofade)\n:star: [tjuyy](https://github.com/tjuyy)\n:star: [seagalputra](https://github.com/seagalputra)\n:star: [dskusuma](https://github.com/dskusuma)\n:star: [marcotinacci](https://github.com/marcotinacci)\n:star: [dsw0214](https://github.com/dsw0214)\n:star: [rafaelcalleja](https://github.com/rafaelcalleja)\n:star: [tkeitzl](https://github.com/tkeitzl)\n:star: [morkot](https://github.com/morkot)\n:star: [Dougs71](https://github.com/Dougs71)\n:star: [Nyadesune](https://github.com/Nyadesune)\n:star: [Stormiix](https://github.com/Stormiix)\n:star: [VahidAlizadeh](https://github.com/VahidAlizadeh)\n:star: [jamesvibar](https://github.com/jamesvibar)\n:star: [piharpi](https://github.com/piharpi)\n:star: 
[gabtub](https://github.com/gabtub)\n:star: [hex42](https://github.com/hex42)\n:star: [decryptus](https://github.com/decryptus)\n:star: [femicodes](https://github.com/femicodes)\n:star: [paulmillerp03](https://github.com/paulmillerp03)\n:star: [AndreLucasrs](https://github.com/AndreLucasrs)\n:star: [TyIsI](https://github.com/TyIsI)\n:star: [thenx](https://github.com/thenx)\n:star: [YanghangXu](https://github.com/YanghangXu)\n:star: [DCsunset](https://github.com/DCsunset)\n:star: [pablocrivella](https://github.com/pablocrivella)\n:star: [dangnguyen27](https://github.com/dangnguyen27)\n:star: [mmicome](https://github.com/mmicome)\n:star: [atriple](https://github.com/atriple)\n:star: [jocobtt](https://github.com/jocobtt)\n:star: [romap0](https://github.com/romap0)\n:star: [William0Friend](https://github.com/William0Friend)\n:star: [aimuch](https://github.com/aimuch)\n:star: [breeze924](https://github.com/breeze924)\n:star: [OnurSevket](https://github.com/OnurSevket)\n:star: [cnodin](https://github.com/cnodin)\n:star: [weisurya](https://github.com/weisurya)\n:star: [M3te0r](https://github.com/M3te0r)\n:star: [Celcis](https://github.com/Celcis)\n:star: [wesleimp](https://github.com/wesleimp)\n:star: [lakshmanpasala](https://github.com/lakshmanpasala)\n:star: [fedorovic82](https://github.com/fedorovic82)\n:star: [amjimenez](https://github.com/amjimenez)\n:star: [thiagotnunes](https://github.com/thiagotnunes)\n:star: [v-Muddu](https://github.com/v-Muddu)\nYou all are the stars! :star2:\n\n - - - \n# RELEASES\nLast week there were no releases.\n\n - - - \n\nThat's all for last week, please :eyes: **Watch** and :star: **Star** the repository [*veggiemonk/awesome-docker*](https://github.com/veggiemonk/awesome-docker) to receive next weekly updates. 
:smiley:\n\n*You can also [view all Weekly Digests by clicking here](https://github.com/veggiemonk/awesome-docker/issues?q=is:open+is:issue+label:weekly-digest).* \n\n> Your [**Weekly Digest**](https://github.com/apps/weekly-digest) bot. :calendar:\n","title":"Weekly Digest (4 August, 2019 - 11 August, 2019)","body":"Here's the **Weekly Digest** for [*veggiemonk/awesome-docker*](https://github.com/veggiemonk/awesome-docker):\n\n - - - \n# ISSUES\nLast week 2 issues were created.\nOf these, 2 issues have been closed and 0 issues are still open.\n## CLOSED ISSUES\n:heart: #729 [fix Travis CI Build #2400](https://github.com/veggiemonk/awesome-docker/pull/729), by [agebhar1](https://github.com/agebhar1)\n:heart: #728 [Fix a typo](https://github.com/veggiemonk/awesome-docker/pull/728), by [gokaygurcan](https://github.com/gokaygurcan)\n## NOISY ISSUE\n:speaker: #728 [Fix a typo](https://github.com/veggiemonk/awesome-docker/pull/728), by [gokaygurcan](https://github.com/gokaygurcan)\nIt received 2 comments.\n\n - - - \n# PULL REQUESTS\nLast week, 2 pull requests were created, updated or merged.\n## MERGED PULL REQUEST\nLast week, 2 pull requests were merged.\n:purple_heart: #729 [fix Travis CI Build #2400](https://github.com/veggiemonk/awesome-docker/pull/729), by [agebhar1](https://github.com/agebhar1)\n:purple_heart: #728 [Fix a typo](https://github.com/veggiemonk/awesome-docker/pull/728), by [gokaygurcan](https://github.com/gokaygurcan)\n\n - - - \n# COMMITS\nLast week there were 9 commits.\n:hammer_and_wrench: [Automated update repository metadata [skip-ci]](https://github.com/veggiemonk/awesome-docker/commit/a6fca13c53bec4aecb0f335ad2b7ea1961e46cda) by [veggiemonk](https://github.com/veggiemonk)\n:hammer_and_wrench: [Automated update repository metadata [skip-ci]](https://github.com/veggiemonk/awesome-docker/commit/9b4ed4060f548d16090fe2bce2ca72bd393847cf) by [veggiemonk](https://github.com/veggiemonk)\n:hammer_and_wrench: [Merge pull request #729 from 
agebhar1/feature/fix-TravisCI#2400 fix Travis CI Build #2400](https://github.com/veggiemonk/awesome-docker/commit/7c97cd24c3c8c301b661e3109ba78bcde97f3473) by [veggiemonk](https://github.com/veggiemonk)\n:hammer_and_wrench: [remove 'docker-fluentd' since it's not available anymore TravisCI [#2400]: > Issues :-( > > > Links > 1. [L185] 404 https://github.com/kiyoto/docker-fluentd > > Dupes > None \u2713 [#2400] https://travis-ci.org/veggiemonk/awesome-docker/builds/569439029](https://github.com/veggiemonk/awesome-docker/commit/3d43a68699d9da3b8d3c1c776cf11d7abfa34909) by [agebhar1](https://github.com/agebhar1)\n:hammer_and_wrench: [fix '4c0a16c Update README.md'](https://github.com/veggiemonk/awesome-docker/commit/aed14d621b97c53df9c27a1c74dda804fb089c27) by [agebhar1](https://github.com/agebhar1)\n:hammer_and_wrench: [Update README.md](https://github.com/veggiemonk/awesome-docker/commit/4c0a16cf5c29fc6cc309dbc7cbe85a30d6b2568f) by [gokaygurcan](https://github.com/gokaygurcan)\n:hammer_and_wrench: [Automated update repository metadata [skip-ci]](https://github.com/veggiemonk/awesome-docker/commit/45fd3df19eb58227bc8275f6bf6846092dc80d0b) by [veggiemonk](https://github.com/veggiemonk)\n:hammer_and_wrench: [Automated update repository metadata [skip-ci]](https://github.com/veggiemonk/awesome-docker/commit/7ce07beb6f94929c5dfdc448699adb6cea32a653) by [veggiemonk](https://github.com/veggiemonk)\n:hammer_and_wrench: [Fix mesosphere renamed d2iq](https://github.com/veggiemonk/awesome-docker/commit/70fe28b14adbebc35705b2f505cd66970384cffe) by [veggiemonk](https://github.com/veggiemonk)\n\n - - - \n# CONTRIBUTORS\nLast week there were 3 contributors.\n:bust_in_silhouette: [veggiemonk](https://github.com/veggiemonk)\n:bust_in_silhouette: [agebhar1](https://github.com/agebhar1)\n:bust_in_silhouette: [gokaygurcan](https://github.com/gokaygurcan)\n\n - - - \n# STARGAZERS\nLast week there were 67 stagazers.\n:star: [tbw-wb](https://github.com/tbw-wb)\n:star: 
[ayalcin1](https://github.com/ayalcin1)\n:star: [sssxie](https://github.com/sssxie)\n:star: [shimadama](https://github.com/shimadama)\n:star: [charstnut](https://github.com/charstnut)\n:star: [nienjiuntai](https://github.com/nienjiuntai)\n:star: [ashwamegh](https://github.com/ashwamegh)\n:star: [oshou](https://github.com/oshou)\n:star: [wlisrausr](https://github.com/wlisrausr)\n:star: [leaked](https://github.com/leaked)\n:star: [paraparity](https://github.com/paraparity)\n:star: [metalmandalore](https://github.com/metalmandalore)\n:star: [sebnapi](https://github.com/sebnapi)\n:star: [CompilerBian](https://github.com/CompilerBian)\n:star: [Kuri-su](https://github.com/Kuri-su)\n:star: [dionysisk](https://github.com/dionysisk)\n:star: [Raltay](https://github.com/Raltay)\n:star: [balloontmz](https://github.com/balloontmz)\n:star: [jairofloress](https://github.com/jairofloress)\n:star: [pseegaha](https://github.com/pseegaha)\n:star: [mvpvg](https://github.com/mvpvg)\n:star: [danielvelara](https://github.com/danielvelara)\n:star: [Fofade](https://github.com/Fofade)\n:star: [tjuyy](https://github.com/tjuyy)\n:star: [seagalputra](https://github.com/seagalputra)\n:star: [dskusuma](https://github.com/dskusuma)\n:star: [marcotinacci](https://github.com/marcotinacci)\n:star: [dsw0214](https://github.com/dsw0214)\n:star: [rafaelcalleja](https://github.com/rafaelcalleja)\n:star: [tkeitzl](https://github.com/tkeitzl)\n:star: [morkot](https://github.com/morkot)\n:star: [Dougs71](https://github.com/Dougs71)\n:star: [Nyadesune](https://github.com/Nyadesune)\n:star: [Stormiix](https://github.com/Stormiix)\n:star: [VahidAlizadeh](https://github.com/VahidAlizadeh)\n:star: [jamesvibar](https://github.com/jamesvibar)\n:star: [piharpi](https://github.com/piharpi)\n:star: [gabtub](https://github.com/gabtub)\n:star: [hex42](https://github.com/hex42)\n:star: [decryptus](https://github.com/decryptus)\n:star: [femicodes](https://github.com/femicodes)\n:star: 
[paulmillerp03](https://github.com/paulmillerp03)\n:star: [AndreLucasrs](https://github.com/AndreLucasrs)\n:star: [TyIsI](https://github.com/TyIsI)\n:star: [thenx](https://github.com/thenx)\n:star: [YanghangXu](https://github.com/YanghangXu)\n:star: [DCsunset](https://github.com/DCsunset)\n:star: [pablocrivella](https://github.com/pablocrivella)\n:star: [dangnguyen27](https://github.com/dangnguyen27)\n:star: [mmicome](https://github.com/mmicome)\n:star: [atriple](https://github.com/atriple)\n:star: [jocobtt](https://github.com/jocobtt)\n:star: [romap0](https://github.com/romap0)\n:star: [William0Friend](https://github.com/William0Friend)\n:star: [aimuch](https://github.com/aimuch)\n:star: [breeze924](https://github.com/breeze924)\n:star: [OnurSevket](https://github.com/OnurSevket)\n:star: [cnodin](https://github.com/cnodin)\n:star: [weisurya](https://github.com/weisurya)\n:star: [M3te0r](https://github.com/M3te0r)\n:star: [Celcis](https://github.com/Celcis)\n:star: [wesleimp](https://github.com/wesleimp)\n:star: [lakshmanpasala](https://github.com/lakshmanpasala)\n:star: [fedorovic82](https://github.com/fedorovic82)\n:star: [amjimenez](https://github.com/amjimenez)\n:star: [thiagotnunes](https://github.com/thiagotnunes)\n:star: [v-Muddu](https://github.com/v-Muddu)\nYou all are the stars! :star2:\n\n - - - \n# RELEASES\nLast week there were no releases.\n\n - - - \n\nThat's all for last week, please :eyes: **Watch** and :star: **Star** the repository [*veggiemonk/awesome-docker*](https://github.com/veggiemonk/awesome-docker) to receive next weekly updates. :smiley:\n\n*You can also [view all Weekly Digests by clicking here](https://github.com/veggiemonk/awesome-docker/issues?q=is:open+is:issue+label:weekly-digest).* \n\n> Your [**Weekly Digest**](https://github.com/apps/weekly-digest) bot. :calendar:\n","html":"

Weekly Digest (4 August, 2019 - 11 August, 2019)

\n\n

Here's the Weekly Digest for veggiemonk/awesome-docker:

\n\n
\n\n

ISSUES

\n\n

Last week 2 issues were created.\nOf these, 2 issues have been closed and 0 issues are still open.

\n\n

CLOSED ISSUES

\n\n

:heart: #729 fix Travis CI Build #2400, by agebhar1\n:heart: #728 Fix a typo, by gokaygurcan

\n\n

NOISY ISSUE

\n\n

:speaker: #728 Fix a typo, by gokaygurcan\nIt received 2 comments.

\n\n
\n\n

PULL REQUESTS

\n\n

Last week, 2 pull requests were created, updated or merged.

\n\n

MERGED PULL REQUEST

\n\n

Last week, 2 pull requests were merged.\n:purpleheart: #729 fix Travis CI Build #2400, by agebhar1\n:purpleheart: #728 Fix a typo, by gokaygurcan

\n\n
\n\n

COMMITS

\n\n

Last week there were 9 commits.\n:hammerandwrench: Automated update repository metadata [skip-ci] by veggiemonk\n:hammerandwrench: Automated update repository metadata [skip-ci] by veggiemonk\n:hammerandwrench: Merge pull request #729 from agebhar1/feature/fix-TravisCI#2400 fix Travis CI Build #2400 by veggiemonk\n:hammerandwrench: remove 'docker-fluentd' since it's not available anymore TravisCI [#2400]: > Issues :-( > > > Links > 1. [L185] 404 https://github.com/kiyoto/docker-fluentd > > Dupes > None \u2713 [#2400] https://travis-ci.org/veggiemonk/awesome-docker/builds/569439029 by agebhar1\n:hammerandwrench: fix '4c0a16c Update README.md' by agebhar1\n:hammerandwrench: Update README.md by gokaygurcan\n:hammerandwrench: Automated update repository metadata [skip-ci] by veggiemonk\n:hammerandwrench: Automated update repository metadata [skip-ci] by veggiemonk\n:hammerandwrench: Fix mesosphere renamed d2iq by veggiemonk

\n\n
\n\n

CONTRIBUTORS

\n\n

Last week there were 3 contributors.\n:bustinsilhouette: veggiemonk\n:bustinsilhouette: agebhar1\n:bustinsilhouette: gokaygurcan

\n\n
\n\n

STARGAZERS

\n\n

Last week there were 67 stagazers.\n:star: tbw-wb\n:star: ayalcin1\n:star: sssxie\n:star: shimadama\n:star: charstnut\n:star: nienjiuntai\n:star: ashwamegh\n:star: oshou\n:star: wlisrausr\n:star: leaked\n:star: paraparity\n:star: metalmandalore\n:star: sebnapi\n:star: CompilerBian\n:star: Kuri-su\n:star: dionysisk\n:star: Raltay\n:star: balloontmz\n:star: jairofloress\n:star: pseegaha\n:star: mvpvg\n:star: danielvelara\n:star: Fofade\n:star: tjuyy\n:star: seagalputra\n:star: dskusuma\n:star: marcotinacci\n:star: dsw0214\n:star: rafaelcalleja\n:star: tkeitzl\n:star: morkot\n:star: Dougs71\n:star: Nyadesune\n:star: Stormiix\n:star: VahidAlizadeh\n:star: jamesvibar\n:star: piharpi\n:star: gabtub\n:star: hex42\n:star: decryptus\n:star: femicodes\n:star: paulmillerp03\n:star: AndreLucasrs\n:star: TyIsI\n:star: thenx\n:star: YanghangXu\n:star: DCsunset\n:star: pablocrivella\n:star: dangnguyen27\n:star: mmicome\n:star: atriple\n:star: jocobtt\n:star: romap0\n:star: William0Friend\n:star: aimuch\n:star: breeze924\n:star: OnurSevket\n:star: cnodin\n:star: weisurya\n:star: M3te0r\n:star: Celcis\n:star: wesleimp\n:star: lakshmanpasala\n:star: fedorovic82\n:star: amjimenez\n:star: thiagotnunes\n:star: v-Muddu\nYou all are the stars! :star2:

\n\n
\n\n

RELEASES

\n\n

Last week there were no releases.

\n\n
\n\n

That's all for last week, please :eyes: Watch and :star: Star the repository veggiemonk/awesome-docker to receive next weekly updates. :smiley:

\n\n

You can also view all Weekly Digests by clicking here.

\n\n
\n

Your Weekly Digest bot. :calendar:

\n
\n","meta":{"source":"GitHub","url":"https://github.com/veggiemonk/awesome-docker/issues/730"},"_input_hash":-20661499,"_task_hash":477989495,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Whitespace stripped from 'classic' javascript elements might lead to illegal empty elements","meta":{"source":"GitHub","url":"https://github.com/camac/Swiper/issues/19"},"label":"DOCUMENTATION","_input_hash":432741954,"_task_hash":1291665724,"answer":"reject"} {"text":"typo in Cluster concepts documentation","meta":{"source":"GitHub","url":"https://github.com/influxdata/docs.influxdata.com/issues/1223"},"label":"DOCUMENTATION","_input_hash":-457464165,"_task_hash":-840575620,"answer":"accept"} {"text":"Feature - Generate Docs For Services Without Swagger Annotations","meta":{"source":"GitHub","url":"https://github.com/kongchen/swagger-maven-plugin/issues/508"},"label":"DOCUMENTATION","_input_hash":-327187468,"_task_hash":320717338,"answer":"reject"} {"text":"FILESYSTEM_CHARSET setting - README update (?)","meta":{"source":"GitHub","url":"https://github.com/dmwarren/nirvana/issues/1"},"label":"DOCUMENTATION","_input_hash":1283034525,"_task_hash":943572397,"answer":"accept"} {"text":"[Request] add readme file with dev environment setup instructions","meta":{"source":"GitHub","url":"https://github.com/pkla/Discolor/issues/1"},"label":"DOCUMENTATION","_input_hash":1370439716,"_task_hash":-752685020,"answer":"accept"} {"text":"Labels","meta":{"source":"GitHub","url":"https://github.com/balancap/SSD-Tensorflow/issues/114"},"label":"DOCUMENTATION","_input_hash":517666899,"_task_hash":2134351449,"answer":"reject"} {"text":"# Assets not loading on GitHub Pages with Custom Domain\n\n# Bug report\r\n\r\n## Describe the bug\r\n\r\nI'm hosting my exported static page on gh-pages with a custom domain. 
\r\n\r\nError shown is as follows:\r\n\r\n```\r\nThe script from \u201chttp://___.co/_next/static/runtime/webpack-f5e50b6b501ccea2a79b.js\u201d was loaded even though its MIME type (\u201ctext/html\u201d) is not a valid JavaScript MIME type.\r\nLoading failed for the \r\n \r\n\r\n\r\n

Hello world!

\r\n\r\n\r\n```","title":"Anti-formatting blocks/regions","body":"**Is your feature request related to a problem? Please describe.**\r\nQuite often, I need to add snippets of minified code from third parties for things like analytics, error reporting, etc. It's quite annoying having this minified, obfuscated rubbish clutter up 384 lines when I'm never going to have to do anything with it\r\n\r\n**Describe the solution you'd like**\r\nWhat I'd like to do is have a special piece of text that could be put in a comment to prevent formatting in that region.\r\n\r\n**Describe alternatives you've considered**\r\nI've thought about putting these minified code sections into a separate file, but I think that isn't great. Sometimes, splitting up sections of code from each other is a good idea but often these small pieces of code are designed to be loaded ASAP. In HTML, by being delivered with the rest of the page, it cuts down on requests back to the server which takes extra time and resources in a browser. Repeating these requests for thousands of clients and for tens or hundreds of files (if you're insane) just gets ridiculous.\r\n\r\nI'm aware that VS Code may not support this. I am not totally sure about the way code formatting extensions work and whether they can fully access the content of a file (I'm assuming they do).\r\n\r\n**Example**\r\n\r\n```html\r\n\r\n\r\n \r\n \r\n \r\n\r\n\r\n

Hello world!

\r\n\r\n\r\n```","html":"

Anti-formatting blocks/regions

\n\n

Is your feature request related to a problem? Please describe.\nQuite often, I need to add snippets of minified code from third parties for things like analytics, error reporting, etc. It's quite annoying having this minified, obfuscated rubbish clutter up 384 lines when I'm never going to have to do anything with it

\n\n

Describe the solution you'd like\nWhat I'd like to do is have a special piece of text that could be put in a comment to prevent formatting in that region.

\n\n

Describe alternatives you've considered\nI've thought about putting these minified code sections into a separate file, but I think that isn't great. Sometimes, splitting up sections of code from each other is a good idea but often these small pieces of code are designed to be loaded ASAP. In HTML, by being delivered with the rest of the page, it cuts down on requests back to the server which takes extra time and resources in a browser. Repeating these requests for thousands of clients and for tens or hundreds of files (if you're insane) just gets ridiculous.

\n\n

I'm aware that VS Code may not support this. I am not totally sure about the way code formatting extensions work and whether they can fully access the content of a file (I'm assuming they do).

\n\n

Example

\n\n

html\n<html>\n<head>\n <!--#prettier-no-format-->\n <script>\n function f(){console.log(\"example code i never have to look at and shouldn't be formatted into a 3000000 line mess\");}\n </script>\n <!--#prettier-no-format-end-->\n</head>\n<body>\n <h1>Hello world!</h1>\n</body>\n</html>\n

\n","meta":{"source":"GitHub","url":"https://github.com/prettier/prettier-vscode/issues/900"},"_input_hash":-870097861,"_task_hash":-1538896314,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# Update Chinese README for #1010 (.name and .usage)\n\nI updated the README in #1010 which has been merged to the develop branch.\r\n\r\nI made the format changes in #1013 for the Chinese README but just copied the english text for the new section in the help.\r\n\r\nA Pull Request, or just a translation of that section, is welcome.","title":"Update Chinese README for #1010 (.name and .usage)","body":"I updated the README in #1010 which has been merged to the develop branch.\r\n\r\nI made the format changes in #1013 for the Chinese README but just copied the english text for the new section in the help.\r\n\r\nA Pull Request, or just a translation of that section, is welcome.","html":"

Update Chinese README for #1010 (.name and .usage)

\n\n

I updated the README in #1010 which has been merged to the develop branch.

\n\n

I made the format changes in #1013 for the Chinese README but just copied the english text for the new section in the help.

\n\n

A Pull Request, or just a translation of that section, is welcome.

\n","meta":{"source":"GitHub","url":"https://github.com/tj/commander.js/issues/1014"},"_input_hash":-1904518570,"_task_hash":-210200581,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"Failed building JavaScript bundle","meta":{"source":"GitHub","url":"https://github.com/facebook/react-native/issues/15219"},"label":"DOCUMENTATION","_input_hash":1766986223,"_task_hash":2007868152,"answer":"reject"} {"text":"React 16 beta","meta":{"source":"GitHub","url":"https://github.com/facebook/react/issues/10294"},"label":"DOCUMENTATION","_input_hash":-2106981753,"_task_hash":870419776,"answer":"reject"} {"text":"# @react-native-community/eslint-config#overrides[2]: Environment key \"jest/globals\" is unknown\n\n### Problem\r\n\r\ni create a new project, use `npx react-native init Demo` from https://github.com/react-native-community/cli/blob/master/docs/init.md, and open js file, vs code show a tip as below.\r\n\r\n\r\nESLint: Demo/.eslintrc.js \u00bb @react-native-community/eslint-config#overrides[2]: Environment key \"jest/globals\" is unknown . 
Please see the 'ESLint' output channel for details.\r\n\r\nI want to upgrade eslint-plugin-config, but @react-native-community/eslint-config locked it.\r\n\r\n\r\n### React Native version:\r\n```\r\nreact-info\r\nSystem:\r\n OS: macOS High Sierra 10.13.6\r\n CPU: (8) x64 Intel(R) Core(TM) i7-4770HQ CPU @ 2.20GHz\r\n Memory: 3.31 GB / 16.00 GB\r\n Shell: 5.3 - /bin/zsh\r\n Binaries:\r\n Node: 12.7.0 - ~/.nvm/versions/node/v12.7.0/bin/node\r\n Yarn: 1.17.3 - ~/.nvm/versions/node/v12.7.0/bin/yarn\r\n npm: 6.10.0 - ~/.nvm/versions/node/v12.7.0/bin/npm\r\n Watchman: 4.9.0 - /usr/local/bin/watchman\r\n SDKs:\r\n iOS SDK:\r\n Platforms: iOS 12.1, macOS 10.14, tvOS 12.1, watchOS 5.1\r\n Android SDK:\r\n API Levels: 25, 26, 27, 28\r\n Build Tools: 25.0.2, 26.0.2, 27.0.3, 28.0.1, 28.0.3\r\n System Images: android-19 | ARM EABI v7a, android-19 | Intel x86 Atom, android-19 | Google APIs Intel x86 Atom, android-22 | Google APIs Intel x86 Atom, android-23 | Google APIs Intel x86 Atom, android-26 | Google APIs Intel x86 Atom, android-28 | Google APIs Intel x86 Atom\r\n Android NDK: 19.2.5345600\r\n IDEs:\r\n Android Studio: 3.4 AI-183.6156.11.34.5522156\r\n Xcode: 10.1/10B61 - /usr/bin/xcodebuild\r\n npmPackages:\r\n react: 16.8.6 => 16.8.6\r\n react-native: 0.60.4 => 0.60.4\r\n npmGlobalPackages:\r\n react-native-cli: 2.0.1\r\n```","title":"@react-native-community/eslint-config#overrides[2]: Environment key \"jest/globals\" is unknown","body":"### Problem\r\n\r\ni create a new project, use `npx react-native init Demo` from https://github.com/react-native-community/cli/blob/master/docs/init.md, and open js file, vs code show a tip as below.\r\n\r\n\r\nESLint: Demo/.eslintrc.js \u00bb @react-native-community/eslint-config#overrides[2]: Environment key \"jest/globals\" is unknown . 
Please see the 'ESLint' output channel for details.\r\n\r\nI want to upgrade eslint-plugin-config, but @react-native-community/eslint-config locked it.\r\n\r\n\r\n### React Native version:\r\n```\r\nreact-info\r\nSystem:\r\n OS: macOS High Sierra 10.13.6\r\n CPU: (8) x64 Intel(R) Core(TM) i7-4770HQ CPU @ 2.20GHz\r\n Memory: 3.31 GB / 16.00 GB\r\n Shell: 5.3 - /bin/zsh\r\n Binaries:\r\n Node: 12.7.0 - ~/.nvm/versions/node/v12.7.0/bin/node\r\n Yarn: 1.17.3 - ~/.nvm/versions/node/v12.7.0/bin/yarn\r\n npm: 6.10.0 - ~/.nvm/versions/node/v12.7.0/bin/npm\r\n Watchman: 4.9.0 - /usr/local/bin/watchman\r\n SDKs:\r\n iOS SDK:\r\n Platforms: iOS 12.1, macOS 10.14, tvOS 12.1, watchOS 5.1\r\n Android SDK:\r\n API Levels: 25, 26, 27, 28\r\n Build Tools: 25.0.2, 26.0.2, 27.0.3, 28.0.1, 28.0.3\r\n System Images: android-19 | ARM EABI v7a, android-19 | Intel x86 Atom, android-19 | Google APIs Intel x86 Atom, android-22 | Google APIs Intel x86 Atom, android-23 | Google APIs Intel x86 Atom, android-26 | Google APIs Intel x86 Atom, android-28 | Google APIs Intel x86 Atom\r\n Android NDK: 19.2.5345600\r\n IDEs:\r\n Android Studio: 3.4 AI-183.6156.11.34.5522156\r\n Xcode: 10.1/10B61 - /usr/bin/xcodebuild\r\n npmPackages:\r\n react: 16.8.6 => 16.8.6\r\n react-native: 0.60.4 => 0.60.4\r\n npmGlobalPackages:\r\n react-native-cli: 2.0.1\r\n```","html":"

@react-native-community/eslint-config#overrides[2]: Environment key \"jest/globals\" is unknown

\n\n

Problem

\n\n

i create a new project, use npx react-native init Demo from https://github.com/react-native-community/cli/blob/master/docs/init.md, and open js file, vs code show a tip as below.

\n\n

ESLint: Demo/.eslintrc.js \u00bb @react-native-community/eslint-config#overrides[2]: Environment key \"jest/globals\" is unknown . Please see the 'ESLint' output channel for details.

\n\n

I want to upgrade eslint-plugin-config, but @react-native-community/eslint-config locked it.

\n\n

React Native version:

\n\n

\nreact-info\nSystem:\n OS: macOS High Sierra 10.13.6\n CPU: (8) x64 Intel(R) Core(TM) i7-4770HQ CPU @ 2.20GHz\n Memory: 3.31 GB / 16.00 GB\n Shell: 5.3 - /bin/zsh\n Binaries:\n Node: 12.7.0 - ~/.nvm/versions/node/v12.7.0/bin/node\n Yarn: 1.17.3 - ~/.nvm/versions/node/v12.7.0/bin/yarn\n npm: 6.10.0 - ~/.nvm/versions/node/v12.7.0/bin/npm\n Watchman: 4.9.0 - /usr/local/bin/watchman\n SDKs:\n iOS SDK:\n Platforms: iOS 12.1, macOS 10.14, tvOS 12.1, watchOS 5.1\n Android SDK:\n API Levels: 25, 26, 27, 28\n Build Tools: 25.0.2, 26.0.2, 27.0.3, 28.0.1, 28.0.3\n System Images: android-19 | ARM EABI v7a, android-19 | Intel x86 Atom, android-19 | Google APIs Intel x86 Atom, android-22 | Google APIs Intel x86 Atom, android-23 | Google APIs Intel x86 Atom, android-26 | Google APIs Intel x86 Atom, android-28 | Google APIs Intel x86 Atom\n Android NDK: 19.2.5345600\n IDEs:\n Android Studio: 3.4 AI-183.6156.11.34.5522156\n Xcode: 10.1/10B61 - /usr/bin/xcodebuild\n npmPackages:\n react: 16.8.6 => 16.8.6\n react-native: 0.60.4 => 0.60.4\n npmGlobalPackages:\n react-native-cli: 2.0.1\n

\n","meta":{"source":"GitHub","url":"https://github.com/facebook/react-native/issues/26021"},"_input_hash":-1258689580,"_task_hash":1926074320,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Indendation rule bugged","meta":{"source":"GitHub","url":"https://github.com/eslint/eslint/issues/9020"},"label":"DOCUMENTATION","_input_hash":37585471,"_task_hash":1476304230,"answer":"reject"} {"text":"Make web pack compatible.","meta":{"source":"GitHub","url":"https://github.com/piuccio/cowsay/issues/25"},"label":"DOCUMENTATION","_input_hash":1918198445,"_task_hash":-1122212763,"answer":"reject"} {"text":"504 Timeout but App Working, Just Slow","meta":{"source":"GitHub","url":"https://github.com/jwilder/nginx-proxy/issues/889"},"label":"DOCUMENTATION","_input_hash":1574883432,"_task_hash":-792811721,"answer":"reject"} {"text":"Requires Node v8","meta":{"source":"GitHub","url":"https://github.com/federicobond/solcheck/issues/11"},"label":"DOCUMENTATION","_input_hash":227079503,"_task_hash":812347841,"answer":"reject"} {"text":"# Function to config a mailbox only for receiving\n\n**Is your feature request related to a problem? Please describe.**\r\nNot really but maybe though. ;)\r\n\r\n**Describe the solution you'd like**\r\nI want a function to config a specific mailbox only for internal communication or just receiving only. The mailbox shall not be able to send in the internet or to other internal recipients.\r\n\r\nI googled and looked into the documentation but didn't find any useful informations.\r\n","title":"Function to config a mailbox only for receiving","body":"**Is your feature request related to a problem? Please describe.**\r\nNot really but maybe though. ;)\r\n\r\n**Describe the solution you'd like**\r\nI want a function to config a specific mailbox only for internal communication or just receiving only. 
The mailbox shall not be able to send in the internet or to other internal recipients.\r\n\r\nI googled and looked into the documentation but didn't find any useful informations.\r\n","html":"

Function to config a mailbox only for receiving

\n\n

Is your feature request related to a problem? Please describe.\nNot really but maybe though. ;)

\n\n

Describe the solution you'd like\nI want a function to config a specific mailbox only for internal communication or just receiving only. The mailbox shall not be able to send in the internet or to other internal recipients.

\n\n

I googled and looked into the documentation but didn't find any useful informations.

\n","meta":{"source":"GitHub","url":"https://github.com/mailcow/mailcow-dockerized/issues/2848"},"_input_hash":1327039431,"_task_hash":-1301615045,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Missing ruby version (2.0.0-p648) for xcode9 osx image","meta":{"source":"GitHub","url":"https://github.com/travis-ci/travis-ci/issues/8170"},"label":"DOCUMENTATION","_input_hash":918426322,"_task_hash":-988210880,"answer":"reject"} {"text":"Docker Hub does not have latest code from master in the head tag","meta":{"source":"GitHub","url":"https://github.com/SUSE/Portus/issues/1343"},"label":"DOCUMENTATION","_input_hash":-1007133559,"_task_hash":-1577882706,"answer":"reject"} {"text":"7.x.3 Biomaterials migration","meta":{"source":"GitHub","url":"https://github.com/tripal/tripal/issues/134"},"label":"DOCUMENTATION","_input_hash":53991006,"_task_hash":1474074045,"answer":"reject"} {"text":"update readme","meta":{"source":"GitHub","url":"https://github.com/rickh94/attaskcreator/issues/12"},"label":"DOCUMENTATION","_input_hash":1692531085,"_task_hash":-1904929317,"answer":"accept"} {"text":"JIRA Server Test Connection fails ","meta":{"source":"GitHub","url":"https://github.com/CognizantQAHub/Cognizant-Intelligent-Test-Scripter/issues/66"},"label":"DOCUMENTATION","_input_hash":1448450168,"_task_hash":709200090,"answer":"reject"} {"text":"Missing Guide for development setup on Windows","meta":{"source":"GitHub","url":"https://github.com/rwieruch/the-road-to-learn-react/issues/60"},"label":"DOCUMENTATION","_input_hash":-509064519,"_task_hash":-1349902854,"answer":"accept"} {"text":"Studio 7 beta release download link not present","meta":{"source":"GitHub","url":"https://github.com/mulesoft/mulesoft-docs/issues/1641"},"label":"DOCUMENTATION","_input_hash":2016322214,"_task_hash":-322967351,"answer":"accept"} {"text":"GridLayers with tileSize of type Point are not 
supported","meta":{"source":"GitHub","url":"https://github.com/TolonUK/Leaflet.EdgeBuffer/issues/15"},"label":"DOCUMENTATION","_input_hash":-311200577,"_task_hash":-955120339,"answer":"reject"} {"text":"# [MAINTENANCE] Pong tutorial mismatch with amethyst_cli\n\n## Description\r\n\r\nThe recommended approach inside of the Getting Started section of the book is to use amethyst_cli to create a project skeleton (https://book.amethyst.rs/stable/getting-started.html). The later tutorials then build on this skeleton. However, amethyst_cli now places display_config.ron into resources/display_config.ron instead of config/display_config.ron, so the tutorial code doesn't work without amending.\r\n\r\n## Reason\r\n\r\nIt's pretty trivial to notice and correct yourself, but also a simple change would eliminate the need. I'm happy to PR the changes if someone more experienced can weigh in and tell me exactly which part of this actually needs changing.\r\n\r\n## Impact\r\n\r\nI don't see why it would, since it's just documentation?\r\n","title":"[MAINTENANCE] Pong tutorial mismatch with amethyst_cli","body":"## Description\r\n\r\nThe recommended approach inside of the Getting Started section of the book is to use amethyst_cli to create a project skeleton (https://book.amethyst.rs/stable/getting-started.html). The later tutorials then build on this skeleton. However, amethyst_cli now places display_config.ron into resources/display_config.ron instead of config/display_config.ron, so the tutorial code doesn't work without amending.\r\n\r\n## Reason\r\n\r\nIt's pretty trivial to notice and correct yourself, but also a simple change would eliminate the need. I'm happy to PR the changes if someone more experienced can weigh in and tell me exactly which part of this actually needs changing.\r\n\r\n## Impact\r\n\r\nI don't see why it would, since it's just documentation?\r\n","html":"

[MAINTENANCE] Pong tutorial mismatch with amethyst_cli

\n\n

Description

\n\n

The recommended approach inside of the Getting Started section of the book is to use amethystcli to create a project skeleton (https://book.amethyst.rs/stable/getting-started.html). The later tutorials then build on this skeleton. However, amethystcli now places displayconfig.ron into resources/displayconfig.ron instead of config/display_config.ron, so the tutorial code doesn't work without amending.

\n\n

Reason

\n\n

It's pretty trivial to notice and correct yourself, but also a simple change would eliminate the need. I'm happy to PR the changes if someone more experienced can weigh in and tell me exactly which part of this actually needs changing.

\n\n

Impact

\n\n

I don't see why it would, since it's just documentation?

\n","meta":{"source":"GitHub","url":"https://github.com/amethyst/amethyst/issues/1873"},"_input_hash":-561038106,"_task_hash":-11140489,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Incorrect Temperature Readings","meta":{"source":"GitHub","url":"https://github.com/bruhautomation/ESP-MQTT-JSON-Multisensor/issues/33"},"label":"DOCUMENTATION","_input_hash":368933749,"_task_hash":260968065,"answer":"reject"} {"text":"win-simple with Domino on Windows","meta":{"source":"GitHub","url":"https://github.com/Lone-Coder/letsencrypt-win-simple/issues/484"},"label":"DOCUMENTATION","_input_hash":-1394740482,"_task_hash":-1288392299,"answer":"reject"} {"text":"Calling \"css\" with \"undefined\" results in unfriendly error","meta":{"source":"GitHub","url":"https://github.com/jquery/jquery/issues/3737"},"label":"DOCUMENTATION","_input_hash":-388863479,"_task_hash":1264885506,"answer":"reject"} {"text":"# Non-encoded $ref on https://swagger.io/docs/specification/using-ref/#escape\n\nOn https://swagger.io/docs/specification/using-ref/#escape, there's a section that says:\r\n\r\n```\r\nFor example, to refer to the path /blogs/{blog_id}/new~posts, you would use:\r\n\r\n$ref: '#/paths/~1blogs~1{blog_id}~1new~0posts'\r\n```\r\n\r\nThe `/` and `~` are encoded to `~1` and `~0` respectively. However, the curly braces around `blog_id` aren't percent-encoded (RFC3986), and this throws an error on `editor.swagger.io`:\r\n\r\n\"Screen\r\n\r\nThe $ref should be `#/paths/~1blogs~1%7Bblog_id%7D~1new~0posts`\r\n","title":"Non-encoded $ref on https://swagger.io/docs/specification/using-ref/#escape","body":"On https://swagger.io/docs/specification/using-ref/#escape, there's a section that says:\r\n\r\n```\r\nFor example, to refer to the path /blogs/{blog_id}/new~posts, you would use:\r\n\r\n$ref: '#/paths/~1blogs~1{blog_id}~1new~0posts'\r\n```\r\n\r\nThe `/` and `~` are encoded to `~1` and `~0` respectively. 
However, the curly braces around `blog_id` aren't percent-encoded (RFC3986), and this throws an error on `editor.swagger.io`:\r\n\r\n\"Screen\r\n\r\nThe $ref should be `#/paths/~1blogs~1%7Bblog_id%7D~1new~0posts`\r\n","html":"

Non-encoded $ref on https://swagger.io/docs/specification/using-ref/#escape

\n\n

On https://swagger.io/docs/specification/using-ref/#escape, there's a section that says:

\n\n

```\nFor example, to refer to the path /blogs/{blog_id}/new~posts, you would use:

\n\n

$ref: '#/paths/~1blogs~1{blog_id}~1new~0posts'\n```

\n\n

The / and ~ are encoded to ~1 and ~0 respectively. However, the curly braces around blog_id aren't percent-encoded (RFC3986), and this throws an error on editor.swagger.io:

\n\n

\"Screen

\n\n

The $ref should be #/paths/~1blogs~1%7Bblog_id%7D~1new~0posts

\n","meta":{"source":"GitHub","url":"https://github.com/swagger-api/swagger.io/issues/250"},"_input_hash":1356263604,"_task_hash":-1823289317,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"README should have overview of how algorithm works","meta":{"source":"GitHub","url":"https://github.com/ptrkkim/Genetic-Algo-Tech-Talk/issues/5"},"label":"DOCUMENTATION","_input_hash":963459944,"_task_hash":-1312817112,"answer":"accept"} {"text":"# There is no open SDK ?!\n\nHello.\r\nI have bought Flip because of \"Open SDK\", but there is no one. I just want to build my own xdevice.so for Linux and SteamVR. \r\nIs there any chance to get source code of xdevice?\r\nOr is there any tech documentation about communication with Flip (structures, protocols, etc) for writing my one driver?","title":"There is no open SDK ?!","body":"Hello.\r\nI have bought Flip because of \"Open SDK\", but there is no one. I just want to build my own xdevice.so for Linux and SteamVR. \r\nIs there any chance to get source code of xdevice?\r\nOr is there any tech documentation about communication with Flip (structures, protocols, etc) for writing my one driver?","html":"

There is no open SDK ?!

\n\n

Hello.\nI have bought Flip because of \"Open SDK\", but there is no one. I just want to build my own xdevice.so for Linux and SteamVR. \nIs there any chance to get source code of xdevice?\nOr is there any tech documentation about communication with Flip (structures, protocols, etc) for writing my one driver?

\n","meta":{"source":"GitHub","url":"https://github.com/Ximmerse/SDK_Flip/issues/3"},"_input_hash":-796793353,"_task_hash":-274730689,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"List of APIs to Implement?","meta":{"source":"GitHub","url":"https://github.com/xNinjaKittyx/KoyomiBot/issues/8"},"label":"DOCUMENTATION","_input_hash":864663825,"_task_hash":-1712202006,"answer":"reject"} {"text":"# [ngrx] Documentation/unit tests: Swiss Components\n\nWrite unit tests and update all documentation for the components directory of the swiss module.","title":"[ngrx] Documentation/unit tests: Swiss Components","body":"Write unit tests and update all documentation for the components directory of the swiss module.","html":"

[ngrx] Documentation/unit tests: Swiss Components

\n\n

Write unit tests and update all documentation for the components directory of the swiss module.

\n","meta":{"source":"GitHub","url":"https://github.com/sten626/mirror-match/issues/79"},"_input_hash":1935776102,"_task_hash":-808320773,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"question: installing development machine error","meta":{"source":"GitHub","url":"https://github.com/pavel-demin/red-pitaya-notes/issues/500"},"label":"DOCUMENTATION","_input_hash":-1970375130,"_task_hash":-1650030961,"answer":"reject"} {"text":"# update README\n\nadd contribution guide to readme","title":"update README","body":"add contribution guide to readme","html":"

update README

\n\n

add contribution guide to readme

\n","meta":{"source":"GitHub","url":"https://github.com/rstats-lab/RGoogleAds/issues/13"},"_input_hash":214098684,"_task_hash":-1045533012,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"# Translate /docs/concepts/configuration/overview/ in Korean\n\n**This is a Feature Request**\r\n\r\n\r\n\r\n\r\n**What would you like to be added**\r\n\r\nTranslate /docs/concepts/configuration/overview/ in Korean\r\n\r\n**Why is this needed**\r\nNo translation with /docs/concepts/configuration/overview/ in Korean\r\n\r\n**Comments**\r\n\r\nPage to update : https://kubernetes.io/docs/concepts/configuration/overview/\r\n","title":"Translate /docs/concepts/configuration/overview/ in Korean","body":"**This is a Feature Request**\r\n\r\n\r\n\r\n\r\n**What would you like to be added**\r\n\r\nTranslate /docs/concepts/configuration/overview/ in Korean\r\n\r\n**Why is this needed**\r\nNo translation with /docs/concepts/configuration/overview/ in Korean\r\n\r\n**Comments**\r\n\r\nPage to update : https://kubernetes.io/docs/concepts/configuration/overview/\r\n","html":"

Translate /docs/concepts/configuration/overview/ in Korean

\n\n

This is a Feature Request

\n\n

\n

\n\n

What would you like to be added\n\nTranslate /docs/concepts/configuration/overview/ in Korean

\n\n

Why is this needed\nNo translation with /docs/concepts/configuration/overview/ in Korean

\n\n

Comments\n\nPage to update : https://kubernetes.io/docs/concepts/configuration/overview/

\n","meta":{"source":"GitHub","url":"https://github.com/kubernetes/website/issues/15802"},"_input_hash":595835640,"_task_hash":-1923147169,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"# README, add a readme\n\nPlease write some lines on how to use it with a demo screenshot. or a cli swag from ascimania ","title":"README, add a readme","body":"Please write some lines on how to use it with a demo screenshot. or a cli swag from ascimania ","html":"

README, add a readme

\n\n

Please write some lines on how to use it with a demo screenshot. or a cli swag from ascimania

\n","meta":{"source":"GitHub","url":"https://github.com/rdeyvil/Locator/issues/2"},"_input_hash":-867959828,"_task_hash":1244376052,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"# Update README\n\n**User Story**\r\n\r\n1. use safer set_password and unlock commands.\r\n\r\n2. Above \"FAQ\", the 3 links are broken.\r\n\r\n3. In FAQ, the answer to the question \"Is there a way to access methods which require login over HTTP\" is outdated. We now support Basic HTTP authentication, see https://github.com/bitshares/bitshares-core/pull/223.\r\n\r\n4. the link to \"type.hpp\" is broken, additionally, the code is less readable for answering the question due to #1506.\r\n\r\n**Impacts**\r\nDescribe which portion(s) of BitShares Core may be impacted by your request. Please tick at least one box.\r\n- [ ] API (the application programming interface)\r\n- [ ] Build (the build process or something prior to compiled code)\r\n- [ ] CLI (the command line wallet)\r\n- [ ] Deployment (the deployment process after building such as Docker, Travis, etc.)\r\n- [ ] DEX (the Decentralized EXchange, market engine, etc.)\r\n- [ ] P2P (the peer-to-peer network for transaction/block propagation)\r\n- [ ] Performance (system or user efficiency, etc.)\r\n- [ ] Protocol (the blockchain logic, consensus, validation, etc.)\r\n- [ ] Security (the security of system or user data, etc.)\r\n- [ ] UX (the User Experience)\r\n- [ ] Other (please add below)\r\n\r\n**Additional Context (optional)**\r\nAdd any other context about your request here.\r\n\r\n## CORE TEAM TASK LIST\r\n- [ ] Evaluate / Prioritize Feature Request\r\n- [ ] Refine User Stories / Requirements\r\n- [ ] Define Test Cases\r\n- [ ] Design / Develop Solution\r\n- [ ] Perform QA/Testing\r\n- [ ] Update Documentation\r\n","title":"Update README","body":"**User Story**\r\n\r\n1. use safer set_password and unlock commands.\r\n\r\n2. Above \"FAQ\", the 3 links are broken.\r\n\r\n3. 
In FAQ, the answer to the question \"Is there a way to access methods which require login over HTTP\" is outdated. We now support Basic HTTP authentication, see https://github.com/bitshares/bitshares-core/pull/223.\r\n\r\n4. the link to \"type.hpp\" is broken, additionally, the code is less readable for answering the question due to #1506.\r\n\r\n**Impacts**\r\nDescribe which portion(s) of BitShares Core may be impacted by your request. Please tick at least one box.\r\n- [ ] API (the application programming interface)\r\n- [ ] Build (the build process or something prior to compiled code)\r\n- [ ] CLI (the command line wallet)\r\n- [ ] Deployment (the deployment process after building such as Docker, Travis, etc.)\r\n- [ ] DEX (the Decentralized EXchange, market engine, etc.)\r\n- [ ] P2P (the peer-to-peer network for transaction/block propagation)\r\n- [ ] Performance (system or user efficiency, etc.)\r\n- [ ] Protocol (the blockchain logic, consensus, validation, etc.)\r\n- [ ] Security (the security of system or user data, etc.)\r\n- [ ] UX (the User Experience)\r\n- [ ] Other (please add below)\r\n\r\n**Additional Context (optional)**\r\nAdd any other context about your request here.\r\n\r\n## CORE TEAM TASK LIST\r\n- [ ] Evaluate / Prioritize Feature Request\r\n- [ ] Refine User Stories / Requirements\r\n- [ ] Define Test Cases\r\n- [ ] Design / Develop Solution\r\n- [ ] Perform QA/Testing\r\n- [ ] Update Documentation\r\n","html":"

Update README

\n\n

User Story

\n\n
    \n
  1. use safer set_password and unlock commands.

  2. \n
  3. Above \"FAQ\", the 3 links are broken.

  4. \n
  5. In FAQ, the answer to the question \"Is there a way to access methods which require login over HTTP\" is outdated. We now support Basic HTTP authentication, see https://github.com/bitshares/bitshares-core/pull/223.

  6. \n
  7. the link to \"type.hpp\" is broken, additionally, the code is less readable for answering the question due to #1506.

  8. \n
\n\n

Impacts\nDescribe which portion(s) of BitShares Core may be impacted by your request. Please tick at least one box.\n- [ ] API (the application programming interface)\n- [ ] Build (the build process or something prior to compiled code)\n- [ ] CLI (the command line wallet)\n- [ ] Deployment (the deployment process after building such as Docker, Travis, etc.)\n- [ ] DEX (the Decentralized EXchange, market engine, etc.)\n- [ ] P2P (the peer-to-peer network for transaction/block propagation)\n- [ ] Performance (system or user efficiency, etc.)\n- [ ] Protocol (the blockchain logic, consensus, validation, etc.)\n- [ ] Security (the security of system or user data, etc.)\n- [ ] UX (the User Experience)\n- [ ] Other (please add below)

\n\n

Additional Context (optional)\nAdd any other context about your request here.

\n\n

CORE TEAM TASK LIST

\n\n
    \n
  • [ ] Evaluate / Prioritize Feature Request
  • \n
  • [ ] Refine User Stories / Requirements
  • \n
  • [ ] Define Test Cases
  • \n
  • [ ] Design / Develop Solution
  • \n
  • [ ] Perform QA/Testing
  • \n
  • [ ] Update Documentation
  • \n
\n","meta":{"source":"GitHub","url":"https://github.com/bitshares/bitshares-core/issues/1897"},"_input_hash":1343514366,"_task_hash":1891756406,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"Future direction of SoftFloat and its development model","meta":{"source":"GitHub","url":"https://github.com/ucb-bar/berkeley-softfloat-3/issues/5"},"label":"DOCUMENTATION","_input_hash":1663260627,"_task_hash":125986981,"answer":"reject"} {"text":"Kubelet ignoring --register-schedulable=false","meta":{"source":"GitHub","url":"https://github.com/kubernetes/kubernetes/issues/49628"},"label":"DOCUMENTATION","_input_hash":268111844,"_task_hash":737708268,"answer":"reject"} {"text":"Implement MezaParser ","meta":{"source":"GitHub","url":"https://github.com/frictionlessdata/tabulator-py/issues/184"},"label":"DOCUMENTATION","_input_hash":-109691618,"_task_hash":-1145313643,"answer":"reject"} {"text":"Not able to use a custom theme / docs unclear on how to do it.","meta":{"source":"GitHub","url":"https://github.com/documentationjs/documentation/issues/849"},"label":"DOCUMENTATION","_input_hash":46231926,"_task_hash":679322626,"answer":"accept"} {"text":"Add support for initial table state (columns, filters, sorting) in grid properties","meta":{"source":"GitHub","url":"https://github.com/ceolter/ag-grid/issues/1785"},"label":"DOCUMENTATION","_input_hash":-227218373,"_task_hash":1897321697,"answer":"reject"} {"text":"rowStyleClass in DataTable fires indefinitely","meta":{"source":"GitHub","url":"https://github.com/primefaces/primeng/issues/3520"},"label":"DOCUMENTATION","_input_hash":1859868896,"_task_hash":569517990,"answer":"reject"} {"text":"`.yaydoc.yml` in this repo still uses the old design","meta":{"source":"GitHub","url":"https://github.com/fossasia/yaydoc/issues/317"},"label":"DOCUMENTATION","_input_hash":1626952159,"_task_hash":-1759128563,"answer":"reject"} {"text":"Move webUI functions into a separate 
file.","meta":{"source":"GitHub","url":"https://github.com/ray-project/ray/issues/780"},"label":"DOCUMENTATION","_input_hash":1942463224,"_task_hash":7130631,"answer":"accept"} {"text":"Adjust readme about Capture","meta":{"source":"GitHub","url":"https://github.com/zalando/riptide/issues/212"},"label":"DOCUMENTATION","_input_hash":-99554288,"_task_hash":-169534013,"answer":"accept"} {"text":"# What does ECS stand for?\n\nIt would be nice to have it in the readme if it isn't already. I turned to google for search and it was difficult to learn about because the acronym has (for example) 192 definitions: https://acronyms.thefreedictionary.com/ECS","title":"What does ECS stand for?","body":"It would be nice to have it in the readme if it isn't already. I turned to google for search and it was difficult to learn about because the acronym has (for example) 192 definitions: https://acronyms.thefreedictionary.com/ECS","html":"

What does ECS stand for?

\n\n

It would be nice to have it in the readme if it isn't already. I turned to google for search and it was difficult to learn about because the acronym has (for example) 192 definitions: https://acronyms.thefreedictionary.com/ECS

\n","meta":{"source":"GitHub","url":"https://github.com/TomGillen/legion/issues/2"},"_input_hash":-1950840621,"_task_hash":518857001,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"Bogus keyword on crates.io","meta":{"source":"GitHub","url":"https://github.com/rust-lang/cargo/issues/4332"},"label":"DOCUMENTATION","_input_hash":743844659,"_task_hash":6319449,"answer":"reject"} {"text":"ECONNREFUSED","meta":{"source":"GitHub","url":"https://github.com/graphcool/chromeless/issues/26"},"label":"DOCUMENTATION","_input_hash":2006747677,"_task_hash":-637245801,"answer":"reject"} {"text":"sysadm generate gigabytes of logs and causes system stalls","meta":{"source":"GitHub","url":"https://github.com/trueos/trueos-core/issues/1458"},"label":"DOCUMENTATION","_input_hash":1267307406,"_task_hash":575160668,"answer":"reject"} {"text":"Not able to run a new react-native project for ios","meta":{"source":"GitHub","url":"https://github.com/facebook/react-native/issues/15210"},"label":"DOCUMENTATION","_input_hash":847680820,"_task_hash":1343450963,"answer":"reject"} {"text":"OracleConnection","meta":{"source":"GitHub","url":"https://github.com/laravel-doctrine/orm/issues/261"},"label":"DOCUMENTATION","_input_hash":-1703773840,"_task_hash":-1665499617,"answer":"reject"} {"text":"# OpenBSD build SIGBUS crash when walking around\n\n\r\n\r\n# Describe the bug\r\n\r\nProcesses dies with SIGBUS on OpenBSD builds after walking around\r\n\r\n\r\n# Steps To Reproduce\r\n\r\n1. Be using OpenBSD/OpenBSD's malloc implementation\r\n2. 
Walk in a direction for 30 seconds to a minute\r\n\r\n\r\n# Expected behavior\r\n\r\nTo not crash :>\r\nOpenBSD's malloc implementation sets to top of each page to 0xdf after each free, which you can see in the trace below, this indicates something is being referenced after it was freed\r\n\r\nRelevant Issues:\r\nhttps://github.com/CleverRaven/Cataclysm-DDA/issues/21293\r\nhttps://github.com/CleverRaven/Cataclysm-DDA/pull/21549\r\n\r\n\r\n# Versions and configuration\r\n\r\n- OS: Unix (OpenBSD)\r\n - OS Version: (OpenBSD 6.5-current)\r\n- Game Version: 0.D [64-bit] (I'm on tag jenkins-b9446, not sure why it's reporting this)\r\n- Graphics Version: Tiles\r\n- Mods loaded: [\r\n Dark Days Ahead [dda],\r\n Mutant NPCs [mutant_npcs],\r\n More Locations [more_locations]\r\n]\r\n\r\n\r\n\r\n\r\n# Additional context\r\n\r\n```\r\nGNU gdb (GDB) 7.12.1\r\nCopyright (C) 2017 Free Software Foundation, Inc.\r\nLicense GPLv3+: GNU GPL version 3 or later \r\nThis is free software: you are free to change and redistribute it.\r\nThere is NO WARRANTY, to the extent permitted by law. 
Type \"show copying\"\r\nand \"show warranty\" for details.\r\nThis GDB was configured as \"x86_64-unknown-openbsd6.5\".\r\nType \"show configuration\" for configuration details.\r\nFor bug reporting instructions, please see:\r\n.\r\nFind the GDB manual and other documentation resources online at:\r\n.\r\nFor help, type \"help\".\r\nType \"apropos word\" to search for commands related to \"word\"...\r\nReading symbols from cataclysm-tiles...done.\r\n[New process 402155]\r\nCore was generated by `cataclysm-tiles'.\r\nProgram terminated with signal SIGBUS, Bus error.\r\n#0 _libc_memcmp (s1=0xdfdfdfdfdfdfdfdf, s2=0x7f7ffffee9c1, n=) at /usr/src/lib/libc/string/memcmp.c:46\r\n46 if (*p1++ != *p2++)\r\n(gdb) bt\r\n#0 _libc_memcmp (s1=0xdfdfdfdfdfdfdfdf, s2=0x7f7ffffee9c1, n=) at /usr/src/lib/libc/string/memcmp.c:46\r\n#1 0x0000092d8039cff0 in std::__1::char_traits::compare (__s1=0xdfdfdfdfdfdfdfdf , __s2=, __n=) at /usr/include/c++/v1/__string:250\r\n#2 std::__1::basic_string, std::__1::allocator >::compare > > (this=, __t=...) at /usr/include/c++/v1/string:3693\r\n#3 std::__1::basic_string, std::__1::allocator >::compare (this=, __str=...) at /usr/include/c++/v1/string:3709\r\n#4 std::__1::operator< , std::__1::allocator > (__lhs=..., __rhs=...) at /usr/include/c++/v1/string:3934\r\n#5 std::__1::less, std::__1::allocator > >::operator() (__x=..., this=, __y=...) at /usr/include/c++/v1/__functional_base:55\r\n#6 std::__1::__map_value_compare, std::__1::allocator >, std::__1::__value_type, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > >, std::__1::less, std::__1::allocator > >, true>::operator() (__x=..., this=,\r\n __y=...) 
at /usr/include/c++/v1/map:517\r\n#7 std::__1::__tree, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > >, std::__1::__map_value_compare, std::__1::allocator >, std::__1::__value_type, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > >, std::__1::less, std::__1::allocator > >, true>, std::__1::allocator, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > > > >::__lower_bound, std::__1::allocator > > (this=, __v=..., __root=0x92fc468ff80, __result=0x92d813123c0 ) at /usr/include/c++/v1/__tree:2670\r\n#8 std::__1::__tree, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > >, std::__1::__map_value_compare, std::__1::allocator >, std::__1::__value_type, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > >, std::__1::less, std::__1::allocator > >, true>, std::__1::allocator, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > > > >::find, std::__1::allocator > > (this=, __v=...) at /usr/include/c++/v1/__tree:2599\r\n#9 0x0000092d804c39ac in std::__1::map, std::__1::allocator >, std::__1::basic_string, std::__1::allocator >, std::__1::less, std::__1::allocator > >, std::__1::allocator, std::__1::allocator > const, std::__1::basic_string, std::__1::allocator > > > >::find (this=0x92d813123b8 , __k=...) at /usr/include/c++/v1/map:1375\r\n#10 get_crash_log_file_name () at src/crash.cpp:232\r\n#11 log_crash (type=0x92d80024de5 \"Signal\", msg=0x92d8002011f \"SIGSEGV: Segmentation fault\") at src/crash.cpp:247\r\n#12 0x0000092d804c384a in signal_handler (sig=) at src/crash.cpp:291\r\n#13 0x0000092fade21005 in ?? ()\r\n#14 0x0000093067c08100 in ?? ()\r\n#15 0x0000000000000000 in ?? ()\r\n(gdb)\r\n```\r\n\r\n\r\n","title":"OpenBSD build SIGBUS crash when walking around","body":"\r\n\r\n# Describe the bug\r\n\r\nProcesses dies with SIGBUS on OpenBSD builds after walking around\r\n\r\n\r\n# Steps To Reproduce\r\n\r\n1. 
Be using OpenBSD/OpenBSD's malloc implementation\r\n2. Walk in a direction for 30 seconds to a minute\r\n\r\n\r\n# Expected behavior\r\n\r\nTo not crash :>\r\nOpenBSD's malloc implementation sets to top of each page to 0xdf after each free, which you can see in the trace below, this indicates something is being referenced after it was freed\r\n\r\nRelevant Issues:\r\nhttps://github.com/CleverRaven/Cataclysm-DDA/issues/21293\r\nhttps://github.com/CleverRaven/Cataclysm-DDA/pull/21549\r\n\r\n\r\n# Versions and configuration\r\n\r\n- OS: Unix (OpenBSD)\r\n - OS Version: (OpenBSD 6.5-current)\r\n- Game Version: 0.D [64-bit] (I'm on tag jenkins-b9446, not sure why it's reporting this)\r\n- Graphics Version: Tiles\r\n- Mods loaded: [\r\n Dark Days Ahead [dda],\r\n Mutant NPCs [mutant_npcs],\r\n More Locations [more_locations]\r\n]\r\n\r\n\r\n\r\n\r\n# Additional context\r\n\r\n```\r\nGNU gdb (GDB) 7.12.1\r\nCopyright (C) 2017 Free Software Foundation, Inc.\r\nLicense GPLv3+: GNU GPL version 3 or later \r\nThis is free software: you are free to change and redistribute it.\r\nThere is NO WARRANTY, to the extent permitted by law. 
Type \"show copying\"\r\nand \"show warranty\" for details.\r\nThis GDB was configured as \"x86_64-unknown-openbsd6.5\".\r\nType \"show configuration\" for configuration details.\r\nFor bug reporting instructions, please see:\r\n.\r\nFind the GDB manual and other documentation resources online at:\r\n.\r\nFor help, type \"help\".\r\nType \"apropos word\" to search for commands related to \"word\"...\r\nReading symbols from cataclysm-tiles...done.\r\n[New process 402155]\r\nCore was generated by `cataclysm-tiles'.\r\nProgram terminated with signal SIGBUS, Bus error.\r\n#0 _libc_memcmp (s1=0xdfdfdfdfdfdfdfdf, s2=0x7f7ffffee9c1, n=) at /usr/src/lib/libc/string/memcmp.c:46\r\n46 if (*p1++ != *p2++)\r\n(gdb) bt\r\n#0 _libc_memcmp (s1=0xdfdfdfdfdfdfdfdf, s2=0x7f7ffffee9c1, n=) at /usr/src/lib/libc/string/memcmp.c:46\r\n#1 0x0000092d8039cff0 in std::__1::char_traits::compare (__s1=0xdfdfdfdfdfdfdfdf , __s2=, __n=) at /usr/include/c++/v1/__string:250\r\n#2 std::__1::basic_string, std::__1::allocator >::compare > > (this=, __t=...) at /usr/include/c++/v1/string:3693\r\n#3 std::__1::basic_string, std::__1::allocator >::compare (this=, __str=...) at /usr/include/c++/v1/string:3709\r\n#4 std::__1::operator< , std::__1::allocator > (__lhs=..., __rhs=...) at /usr/include/c++/v1/string:3934\r\n#5 std::__1::less, std::__1::allocator > >::operator() (__x=..., this=, __y=...) at /usr/include/c++/v1/__functional_base:55\r\n#6 std::__1::__map_value_compare, std::__1::allocator >, std::__1::__value_type, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > >, std::__1::less, std::__1::allocator > >, true>::operator() (__x=..., this=,\r\n __y=...) 
at /usr/include/c++/v1/map:517\r\n#7 std::__1::__tree, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > >, std::__1::__map_value_compare, std::__1::allocator >, std::__1::__value_type, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > >, std::__1::less, std::__1::allocator > >, true>, std::__1::allocator, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > > > >::__lower_bound, std::__1::allocator > > (this=, __v=..., __root=0x92fc468ff80, __result=0x92d813123c0 ) at /usr/include/c++/v1/__tree:2670\r\n#8 std::__1::__tree, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > >, std::__1::__map_value_compare, std::__1::allocator >, std::__1::__value_type, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > >, std::__1::less, std::__1::allocator > >, true>, std::__1::allocator, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > > > >::find, std::__1::allocator > > (this=, __v=...) at /usr/include/c++/v1/__tree:2599\r\n#9 0x0000092d804c39ac in std::__1::map, std::__1::allocator >, std::__1::basic_string, std::__1::allocator >, std::__1::less, std::__1::allocator > >, std::__1::allocator, std::__1::allocator > const, std::__1::basic_string, std::__1::allocator > > > >::find (this=0x92d813123b8 , __k=...) at /usr/include/c++/v1/map:1375\r\n#10 get_crash_log_file_name () at src/crash.cpp:232\r\n#11 log_crash (type=0x92d80024de5 \"Signal\", msg=0x92d8002011f \"SIGSEGV: Segmentation fault\") at src/crash.cpp:247\r\n#12 0x0000092d804c384a in signal_handler (sig=) at src/crash.cpp:291\r\n#13 0x0000092fade21005 in ?? ()\r\n#14 0x0000093067c08100 in ?? ()\r\n#15 0x0000000000000000 in ?? ()\r\n(gdb)\r\n```\r\n\r\n\r\n","html":"
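The poison value in the `s1=0xdfdfdfdfdfdfdfdf` frame is the heart of the report: OpenBSD's allocator junk-fills freed memory with 0xdf, so any later read through a dangling pointer sees the pattern instead of the old contents. A minimal Python sketch of that behavior (`simulated_free` is an illustrative stand-in for the allocator, not real OpenBSD code):

```python
# Simulation of OpenBSD's junk-filling free(): freed chunks are
# overwritten with 0xdf, so a read through a dangling reference sees
# the poison pattern (as in the s1=0xdfdfdfdfdfdfdfdf frame) instead
# of the old string contents.
POISON = 0xDF

def simulated_free(buf: bytearray) -> None:
    """Stand-in for OpenBSD free(): poison the memory on release."""
    for i in range(len(buf)):
        buf[i] = POISON

buf = bytearray(b"mod_id")   # e.g. a string owned by a map key
dangling = buf               # a second reference kept past the "free"
simulated_free(buf)

# A use-after-free comparison now reads poison, not "mod_id":
assert bytes(dangling) != b"mod_id"
assert all(b == POISON for b in dangling)
```

Under a real allocator the same access is undefined behavior; the junk fill just makes it crash loudly, which is what surfaces here as SIGBUS.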

OpenBSD build SIGBUS crash when walking around

\n\n\n\n

Describe the bug

\n\n

Process dies with SIGBUS on OpenBSD builds after walking around

\n\n

Steps To Reproduce

\n\n
    \n
  1. Be using OpenBSD/OpenBSD's malloc implementation
  2. \n
  3. Walk in a direction for 30 seconds to a minute
  4. \n
\n\n

Expected behavior

\n\n

To not crash :>\nOpenBSD's malloc implementation sets the top of each page to 0xdf after each free, which you can see in the trace below; this indicates something is being referenced after it was freed

\n\n

Relevant Issues:\nhttps://github.com/CleverRaven/Cataclysm-DDA/issues/21293\nhttps://github.com/CleverRaven/Cataclysm-DDA/pull/21549

\n\n

Versions and configuration

\n\n
    \n
  • OS: Unix (OpenBSD)\n
      \n
    • OS Version: (OpenBSD 6.5-current)
    • \n
  • \n
  • Game Version: 0.D [64-bit] (I'm on tag jenkins-b9446, not sure why it's reporting this)
  • \n
  • Graphics Version: Tiles
  • \n
  • Mods loaded: [\nDark Days Ahead [dda],\nMutant NPCs [mutant_npcs],\nMore Locations [more_locations]\n]
  • \n
\n\n\n\n

Additional context

\n\n

```\nGNU gdb (GDB) 7.12.1\nCopyright (C) 2017 Free Software Foundation, Inc.\nLicense GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>\nThis is free software: you are free to change and redistribute it.\nThere is NO WARRANTY, to the extent permitted by law. Type \"show copying\"\nand \"show warranty\" for details.\nThis GDB was configured as \"x86_64-unknown-openbsd6.5\".\nType \"show configuration\" for configuration details.\nFor bug reporting instructions, please see:\n<http://www.gnu.org/software/gdb/bugs/>.\nFind the GDB manual and other documentation resources online at:\n<http://www.gnu.org/software/gdb/documentation/>.\nFor help, type \"help\".\nType \"apropos word\" to search for commands related to \"word\"...\nReading symbols from cataclysm-tiles...done.\n[New process 402155]\nCore was generated by `cataclysm-tiles'.\nProgram terminated with signal SIGBUS, Bus error.

\n\n

#0 _libc_memcmp (s1=0xdfdfdfdfdfdfdfdf, s2=0x7f7ffffee9c1, n=) at /usr/src/lib/libc/string/memcmp.c:46

\n\n

46 if (*p1++ != *p2++)\n(gdb) bt

\n\n

#0 _libc_memcmp (s1=0xdfdfdfdfdfdfdfdf, s2=0x7f7ffffee9c1, n=) at /usr/src/lib/libc/string/memcmp.c:46

\n\n

#1 0x0000092d8039cff0 in std::__1::char_traits::compare (__s1=0xdfdfdfdfdfdfdfdf , __s2=, __n=) at /usr/include/c++/v1/__string:250

\n\n

#2 std::__1::basic_string, std::__1::allocator >::compare > > (this=, __t=...) at /usr/include/c++/v1/string:3693

\n\n

#3 std::__1::basic_string, std::__1::allocator >::compare (this=, __str=...) at /usr/include/c++/v1/string:3709

\n\n

#4 std::__1::operator< , std::__1::allocator > (__lhs=..., __rhs=...) at /usr/include/c++/v1/string:3934

\n\n

#5 std::__1::less, std::__1::allocator > >::operator() (__x=..., this=, __y=...) at /usr/include/c++/v1/__functional_base:55

\n\n

#6 std::__1::__map_value_compare, std::__1::allocator >, std::__1::__value_type, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > >, std::__1::less, std::__1::allocator > >, true>::operator() (__x=..., this=,

\n\n
__y=...) at /usr/include/c++/v1/map:517\n
\n\n

#7 std::__1::__tree, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > >, std::__1::__map_value_compare, std::__1::allocator >, std::__1::__value_type, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > >, std::__1::less, std::__1::allocator > >, true>, std::__1::allocator, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > > > >::__lower_bound, std::__1::allocator > > (this=, __v=..., __root=0x92fc468ff80, __result=0x92d813123c0 ) at /usr/include/c++/v1/__tree:2670

\n\n

#8 std::__1::__tree, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > >, std::__1::__map_value_compare, std::__1::allocator >, std::__1::__value_type, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > >, std::__1::less, std::__1::allocator > >, true>, std::__1::allocator, std::__1::allocator >, std::__1::basic_string, std::__1::allocator > > > >::find, std::__1::allocator > > (this=, __v=...) at /usr/include/c++/v1/__tree:2599

\n\n

#9 0x0000092d804c39ac in std::__1::map, std::__1::allocator >, std::__1::basic_string, std::__1::allocator >, std::__1::less, std::__1::allocator > >, std::__1::allocator, std::__1::allocator > const, std::__1::basic_string, std::__1::allocator > > > >::find (this=0x92d813123b8 , __k=...) at /usr/include/c++/v1/map:1375

\n\n

#10 get_crash_log_file_name () at src/crash.cpp:232

\n\n

#11 log_crash (type=0x92d80024de5 \"Signal\", msg=0x92d8002011f \"SIGSEGV: Segmentation fault\") at src/crash.cpp:247

\n\n

#12 0x0000092d804c384a in signal_handler (sig=) at src/crash.cpp:291

\n\n

#13 0x0000092fade21005 in ?? ()

\n\n

#14 0x0000093067c08100 in ?? ()

\n\n

#15 0x0000000000000000 in ?? ()

\n\n

(gdb)\n```

\n\n\n","meta":{"source":"GitHub","url":"https://github.com/CleverRaven/Cataclysm-DDA/issues/33135"},"_input_hash":606484742,"_task_hash":-1256030578,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# upgrade dependencies to enable hardened runtime for easier install?\n\nGot the project to build locally \ud83c\udf89 \r\n\r\nI have an enhancement idea \ud83d\udca1 \r\n\r\nWe could upgrade the dependencies to enable hardened runtime. If we do this, I think that it makes it easier for some users on the latest versions of MacOS to install Dozer.\r\n\r\nThis is what I see when I try to validate the build with Apple:\r\n![Screen Shot 2019-08-10 at 4 48 46 PM](https://user-images.githubusercontent.com/2119400/62827909-18111180-bb8f-11e9-8e2f-3bea887401af.png)\r\n\r\nI think that these are the dependencies that would need to be updated or configured to support this:\r\n\r\n- [ ] submit\r\n- [ ] uploadDYSM\r\n- [ ] Autoupdate.app (Sparkle)\r\n- [ ] fileop\r\n\r\n\r\n### research \r\n\r\nhttps://developer.apple.com/documentation/security/notarizing_your_app_before_distribution\r\n\r\nor alternately do some code-signing work-around like the one described here: https://github.com/insidegui/WWDC/issues/540#issuecomment-498483471\r\n\r\nmore discussion here: sparkle-project/Sparkle#1389","title":"upgrade dependencies to enable hardened runtime for easier install?","body":"Got the project to build locally \ud83c\udf89 \r\n\r\nI have an enhancement idea \ud83d\udca1 \r\n\r\nWe could upgrade the dependencies to enable hardened runtime. 
If we do this, I think that it makes it easier for some users on the latest versions of MacOS to install Dozer.\r\n\r\nThis is what I see when I try to validate the build with Apple:\r\n![Screen Shot 2019-08-10 at 4 48 46 PM](https://user-images.githubusercontent.com/2119400/62827909-18111180-bb8f-11e9-8e2f-3bea887401af.png)\r\n\r\nI think that these are the dependencies that would need to be updated or configured to support this:\r\n\r\n- [ ] submit\r\n- [ ] uploadDYSM\r\n- [ ] Autoupdate.app (Sparkle)\r\n- [ ] fileop\r\n\r\n\r\n### research \r\n\r\nhttps://developer.apple.com/documentation/security/notarizing_your_app_before_distribution\r\n\r\nor alternately do some code-signing work-around like the one described here: https://github.com/insidegui/WWDC/issues/540#issuecomment-498483471\r\n\r\nmore discussion here: sparkle-project/Sparkle#1389","html":"

upgrade dependencies to enable hardened runtime for easier install?

\n\n

Got the project to build locally \ud83c\udf89

\n\n

I have an enhancement idea \ud83d\udca1

\n\n

We could upgrade the dependencies to enable hardened runtime. If we do this, I think it would make it easier for some users on the latest versions of macOS to install Dozer.

\n\n

This is what I see when I try to validate the build with Apple:\n\"Screen

\n\n

I think that these are the dependencies that would need to be updated or configured to support this:

\n\n
    \n
  • [ ] submit
  • \n
  • [ ] uploadDYSM
  • \n
  • [ ] Autoupdate.app (Sparkle)
  • \n
  • [ ] fileop
  • \n
\n\n

research

\n\n

https://developer.apple.com/documentation/security/notarizing_your_app_before_distribution

\n\n

or alternatively, use a code-signing workaround like the one described here: https://github.com/insidegui/WWDC/issues/540#issuecomment-498483471

\n\n

more discussion here: sparkle-project/Sparkle#1389

\n","meta":{"source":"GitHub","url":"https://github.com/Mortennn/Dozer/issues/71"},"_input_hash":-1464466906,"_task_hash":-1075644992,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# MaxAuthTries - Citation(s) for baseline choice.\n\nI'm new to Ansible and have got a lot of value from your ansible-ssh-hardening project, thanks!\r\n\r\nI did hit one snag with the MaxAuthTries setting of 2, compared to the default of 6. I actually managed to lock myself out of a host due to my ssh agent offering different keys before the correct one, causing a \"Too many authentication failures for XXX\" disconnection. After discovering the issue I wanted to understand from the baseline why this setting is chosen.\r\n\r\nI read the description for this control baseline and am struggling to see if changing this setting offers any tangible benefits for the increased risk of inconvenience (based on the fact the baseline already requires password login disabled).\r\n\r\nI wanted to offer a general observation here. The internet is full of varying quality guides for hardening SSH with very little reference to reputable STIG or other similar frameworks for secure configuration. I think the vision for this project is fantastic, it should streamline things for many people but I think it's important to track why baseline settings are chosen, citations for any particular attack vectors and noting some of the tradeoffs for the decision. Otherwise it feels like just an extension of \"cargo cult\" style blog posts where everyone is offering their chosen secure settings with little critical evaluation on why this setting was chosen.\r\n\r\nAre you able to provide some background on this setting?\r\n\r\nI'm happy to open a PR for the ansible-ssh-hardening to update the documentation/faq to flag this, unfortunately as Ansible uses SSH as the control channel it's particularly sensitive to these types of issues! 
","title":"MaxAuthTries - Citation(s) for baseline choice.","body":"I'm new to Ansible and have got a lot of value from your ansible-ssh-hardening project, thanks!\r\n\r\nI did hit one snag with the MaxAuthTries setting of 2, compared to the default of 6. I actually managed to lock myself out of a host due to my ssh agent offering different keys before the correct one, causing a \"Too many authentication failures for XXX\" disconnection. After discovering the issue I wanted to understand from the baseline why this setting is chosen.\r\n\r\nI read the description for this control baseline and am struggling to see if changing this setting offers any tangible benefits for the increased risk of inconvenience (based on the fact the baseline already requires password login disabled).\r\n\r\nI wanted to offer a general observation here. The internet is full of varying quality guides for hardening SSH with very little reference to reputable STIG or other similar frameworks for secure configuration. I think the vision for this project is fantastic, it should streamline things for many people but I think it's important to track why baseline settings are chosen, citations for any particular attack vectors and noting some of the tradeoffs for the decision. Otherwise it feels like just an extension of \"cargo cult\" style blog posts where everyone is offering their chosen secure settings with little critical evaluation on why this setting was chosen.\r\n\r\nAre you able to provide some background on this setting?\r\n\r\nI'm happy to open a PR for the ansible-ssh-hardening to update the documentation/faq to flag this, unfortunately as Ansible uses SSH as the control channel it's particularly sensitive to these types of issues! ","html":"

MaxAuthTries - Citation(s) for baseline choice.

\n\n

I'm new to Ansible and have got a lot of value from your ansible-ssh-hardening project, thanks!

\n\n

I did hit one snag with the MaxAuthTries setting of 2, compared to the default of 6. I actually managed to lock myself out of a host due to my ssh agent offering different keys before the correct one, causing a \"Too many authentication failures for XXX\" disconnection. After discovering the issue I wanted to understand from the baseline why this setting is chosen.

\n\n

I read the description for this control baseline and am struggling to see if changing this setting offers any tangible benefits for the increased risk of inconvenience (based on the fact the baseline already requires password login disabled).

\n\n

I wanted to offer a general observation here. The internet is full of varying quality guides for hardening SSH with very little reference to reputable STIG or other similar frameworks for secure configuration. I think the vision for this project is fantastic, it should streamline things for many people but I think it's important to track why baseline settings are chosen, citations for any particular attack vectors and noting some of the tradeoffs for the decision. Otherwise it feels like just an extension of \"cargo cult\" style blog posts where everyone is offering their chosen secure settings with little critical evaluation on why this setting was chosen.

\n\n

Are you able to provide some background on this setting?

\n\n

I'm happy to open a PR for ansible-ssh-hardening to update the documentation/FAQ to flag this; unfortunately, as Ansible uses SSH as the control channel, it's particularly sensitive to these types of issues!

\n","meta":{"source":"GitHub","url":"https://github.com/dev-sec/ssh-baseline/issues/137"},"_input_hash":-1645937047,"_task_hash":-636649304,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"How can I get the raw body of a MimeMessage","meta":{"source":"GitHub","url":"https://github.com/jstedfast/MailKit/issues/543"},"label":"DOCUMENTATION","_input_hash":-363769968,"_task_hash":992250822,"answer":"reject"} {"text":"Possible casing issue for links to docs category pages","meta":{"source":"GitHub","url":"https://github.com/Wyamio/Wyam/issues/546"},"label":"DOCUMENTATION","_input_hash":-871046477,"_task_hash":-1009487222,"answer":"accept"} {"text":"Please add me for access to this repo - San Jose , CA","meta":{"source":"GitHub","url":"https://github.com/githubschool/open-enrollment-classes-introduction-to-github/issues/8952"},"label":"DOCUMENTATION","_input_hash":744403020,"_task_hash":19134497,"answer":"reject"} {"text":"Dates on Firefox Android seem not to work","meta":{"source":"GitHub","url":"https://github.com/angular/angular/issues/18337"},"label":"DOCUMENTATION","_input_hash":1193213972,"_task_hash":-126227282,"answer":"reject"} {"text":"Logical file names","meta":{"source":"GitHub","url":"https://github.com/solloc/knowledge-base/issues/9"},"label":"DOCUMENTATION","_input_hash":-1347232845,"_task_hash":-851477518,"answer":"reject"} {"text":"Guest Users should be able to register for an account","meta":{"source":"GitHub","url":"https://github.com/summer17-csc648-team1/warehausmedia/issues/3"},"label":"DOCUMENTATION","_input_hash":-1852389409,"_task_hash":703718589,"answer":"reject"} {"text":"add a readme file","meta":{"source":"GitHub","url":"https://github.com/aran91/-/issues/1"},"label":"DOCUMENTATION","_input_hash":-695339775,"_task_hash":-183418096,"answer":"accept"} {"text":"Choropleth 
Map","meta":{"source":"GitHub","url":"https://github.com/gavinr/geojson-csv-join/issues/3"},"label":"DOCUMENTATION","_input_hash":-1062435186,"_task_hash":1215445563,"answer":"reject"} {"text":"# CVE-2018-19838 (Medium) detected in opennms-opennms-source-23.0.0-1\n\n## CVE-2018-19838 - Medium Severity Vulnerability\n
Vulnerable Library - opennms-opennms-source-23.0.0-1

\n

\n\n

A Java based fault and performance management system

\n

Library home page: https://sourceforge.net/projects/opennms/

\n

Found in HEAD commit: eeefb98d520629c182c4d88691216d2bd738678a

\n

\n
\n

\n
Library Source Files (62)\n

\n

* The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.

\n

\n\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/factory.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/value.h\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.hpp\n - /website/docs/node_modules/node-sass/src/callback_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operation.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operators.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.hpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/parser.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.cpp\n - /website/docs/node_modules/node-sass/src/custom_function_bridge.cpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/bind.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/backtrace.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.h\n - /website/docs/node_modules/node-sass/src/sass_types/sass_value_wrapper.h\n 
- /website/docs/node_modules/node-sass/src/libsass/src/error_handling.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debugger.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/number.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/output.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/null.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/inspect.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/values.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.h\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/string.cpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/sass_context.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.h\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.cpp\n

\n
\n

\n

\n\n

\n
Vulnerability Details\n

\n \nIn LibSass prior to 3.5.5, functions inside ast.cpp for IMPLEMENT_AST_OPERATORS expansion allow attackers to cause a denial-of-service resulting from stack consumption via a crafted sass file, as demonstrated by recursive calls involving clone(), cloneChildren(), and copy().\n\n
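The stack-consumption pattern described above can be sketched with a hypothetical depth guard; the names, data shape, and bound below are illustrative, not LibSass's actual fix:

```python
# Hypothetical guard for recursive clone() over attacker-controlled
# nesting: refuse past a bound instead of overflowing the stack.
MAX_DEPTH = 256

def clone(node, depth=0):
    """Clone a nested-list 'AST'; raise rather than recurse without bound."""
    if depth > MAX_DEPTH:
        raise RecursionError("nesting too deep; refusing to clone")
    if not isinstance(node, list):
        return node
    return [clone(child, depth + 1) for child in node]

# Build 1000 levels of nesting, as a crafted sass file might induce:
deep = "leaf"
for _ in range(1000):
    deep = [deep]

try:
    clone(deep)
    guarded = False
except RecursionError:
    guarded = True

assert guarded                                # the guard trips first
assert clone([["a"], "b"]) == [["a"], "b"]    # shallow input still works
```

Without the depth check, the same traversal consumes one stack frame per nesting level, which is the denial-of-service the CVE describes.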

Publish Date: 2018-12-04\n

URL: CVE-2018-19838

\n

\n
\n

\n
CVSS 3 Score Details (6.5)\n

\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: None\n - Integrity Impact: None\n - Availability Impact: High\n

\nFor more information on CVSS3 Scores, click here.\n

\n
\n

\n
Suggested Fix\n

\n\n

Type: Upgrade version

\n

Origin: https://github.com/sass/libsass/blob/3.6.0/src/ast.cpp

\n

Release Date: 2019-07-01

\n

Fix Resolution: 3.6.0

\n\n

\n
\n

\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","title":"CVE-2018-19838 (Medium) detected in opennms-opennms-source-23.0.0-1","body":"## CVE-2018-19838 - Medium Severity Vulnerability\n
Vulnerable Library - opennms-opennms-source-23.0.0-1

\n

\n\n

A Java based fault and performance management system

\n

Library home page: https://sourceforge.net/projects/opennms/

\n

Found in HEAD commit: eeefb98d520629c182c4d88691216d2bd738678a

\n

\n
\n

\n
Library Source Files (62)\n

\n

* The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.

\n

\n\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/factory.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/value.h\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.hpp\n - /website/docs/node_modules/node-sass/src/callback_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operation.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operators.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.hpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/parser.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.cpp\n - /website/docs/node_modules/node-sass/src/custom_function_bridge.cpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/bind.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/backtrace.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.h\n - /website/docs/node_modules/node-sass/src/sass_types/sass_value_wrapper.h\n 
- /website/docs/node_modules/node-sass/src/libsass/src/error_handling.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debugger.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/number.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/output.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/null.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/inspect.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/values.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.h\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/string.cpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/sass_context.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.h\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.cpp\n

\n
\n

\n

\n\n

\n
Vulnerability Details\n

\n \nIn LibSass prior to 3.5.5, functions inside ast.cpp for IMPLEMENT_AST_OPERATORS expansion allow attackers to cause a denial-of-service resulting from stack consumption via a crafted sass file, as demonstrated by recursive calls involving clone(), cloneChildren(), and copy().\n\n

Publish Date: 2018-12-04\n

URL: CVE-2018-19838

\n

\n
\n

\n
CVSS 3 Score Details (6.5)\n

\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: None\n - Integrity Impact: None\n - Availability Impact: High\n

\nFor more information on CVSS3 Scores, click here.\n

\n
\n

\n
Suggested Fix\n

\n\n

Type: Upgrade version

\n

Origin: https://github.com/sass/libsass/blob/3.6.0/src/ast.cpp

\n

Release Date: 2019-07-01

\n

Fix Resolution: 3.6.0

\n\n

\n
\n

\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","html":"

CVE-2018-19838 (Medium) detected in opennms-opennms-source-23.0.0-1

\n\n

CVE-2018-19838 - Medium Severity Vulnerability

\n\n

Vulnerable Library - opennms-opennms-source-23.0.0-1

\n\n

\n\n

A Java based fault and performance management system

\n

Library home page: https://sourceforge.net/projects/opennms/

\n

Found in HEAD commit: eeefb98d520629c182c4d88691216d2bd738678a

\n

\n\n

\n

\n
Library Source Files (62)

\n\n

\n

* The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.

\n

\n\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/factory.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/value.h\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.hpp\n - /website/docs/node_modules/node-sass/src/callback_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operation.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operators.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.hpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/parser.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.cpp\n - /website/docs/node_modules/node-sass/src/custom_function_bridge.cpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/bind.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/backtrace.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.h\n - /website/docs/node_modules/node-sass/src/sass_types/sass_value_wrapper.h\n 
- /website/docs/node_modules/node-sass/src/libsass/src/error_handling.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debugger.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/number.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/output.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/null.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/inspect.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/values.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.h\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/string.cpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/sass_context.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.h\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.cpp\n

\n\n

\n\n

\n

\n\n

\n\n

\n
Vulnerability Details\n

\n\nIn LibSass prior to 3.5.5, functions inside ast.cpp for IMPLEMENT_AST_OPERATORS expansion allow attackers to cause a denial-of-service resulting from stack consumption via a crafted sass file, as demonstrated by recursive calls involving clone(), cloneChildren(), and copy().\n\n

Publish Date: 2018-12-04\n

URL: CVE-2018-19838

\n

\n\n

\n\n

\n
CVSS 3 Score Details (6.5)\n

\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: None\n - Integrity Impact: None\n - Availability Impact: High\n

\n\n

For more information on CVSS3 Scores, click here.\n

\n

\n\n

\n
Suggested Fix\n

\n\n

Type: Upgrade version

\n

Origin: https://github.com/sass/libsass/blob/3.6.0/src/ast.cpp

\n

Release Date: 2019-07-01

\n

Fix Resolution: 3.6.0

\n\n

\n\n

\n\n

\n\n
\n\n

Step up your Open Source Security Game with WhiteSource here

\n","meta":{"source":"GitHub","url":"https://github.com/mixcore/website/issues/13"},"_input_hash":-743036344,"_task_hash":-817266529,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Font Awesome icons are screwed up","meta":{"source":"GitHub","url":"https://github.com/primefaces/primeng/issues/3513"},"label":"DOCUMENTATION","_input_hash":1782665868,"_task_hash":1016219980,"answer":"reject"} {"text":"Wrong readme instructions ","meta":{"source":"GitHub","url":"https://github.com/kofno/BasicLambda/issues/2"},"label":"DOCUMENTATION","_input_hash":224628752,"_task_hash":873975294,"answer":"accept"} {"text":"RStudio Feature Request: Cheatsheet please!","meta":{"source":"GitHub","url":"https://github.com/rstudio/learnr/issues/107"},"label":"DOCUMENTATION","_input_hash":1835531083,"_task_hash":-1445141135,"answer":"accept"} {"text":"# Authentication issues with API\n\n#### Please confirm you have done the following before posting your bug report:\r\n\r\n- [ x] I have enabled debug mode \r\n- [ x] I have read [checked the Common Issues page](https://snipe-it.readme.io/docs/common-issues)\r\n\r\n**Describe the bug**\r\nCalls to the API return unauthorised error. I have generated a personal token as per the documentation and tried several different keys. I also tried creating from the command line. None seem to work.\r\n\r\n`{\r\n \"error\": \"Unauthorized.\"\r\n}`\r\n\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior:\r\n\r\nNote - bearer token has been changed for security. 
\r\n\r\ncurl -X GET \\\r\n https://snipeit.axiomit.com.au/api/v1/companies \\\r\n -H 'Accept: application/json' \\\r\n -H 'Authorization: Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiIsImp0aSI6ImNkNWU4MzQzNWNiZTEwMzQxZGQwZGQ3NmNiZjVmODY4YjQ3ZWYzNWU0Yjk1NmQ0M2MxYzZlZmU1NmU5ZGFiMGNiMWQ3NGExNDY2ZTg0NTZiIn0.eyJhdWQiOiIzIiwianRpIjoiY2Q1ZTgzNDM1Y2JlMTAzNDFkZDBkZDc2Y2JmNWY4NjhiNDdlZjM1ZTRiOTU2ZDQzYzFjNmVmZTU2ZTlkYWIwY2IxZDc0YTE0NjZlODQ1NmIiLCJpYXQiOjE1NjU1MDM3NDUsIm5iZiI6MTU2NTUwMzc0NSwiZXhwIjoxNTk3MTI2MTQ1LCJzdWIiOiIxIiwic2NvcGVzIjpbXX0.SlTsSAMenCirBv0pEXxCz1wISIGBzXT9MWnkR5jHk2XY3KQhwyIRyfyLHI64VuSWvdfWyKMm8sZuiDgL_b3JV5A9IZOKBq6eUjhNkEky0TQ8_dvYh3ACfZl38N1_WKROli65kCfuP2KTAQIna2exSKb6up8ATfnH0ErvPRiHiakjMeJMnP4fZnRXHHWjHIBPSEZJCr2BQRYcrVdgj06NX334x0UW7LDhWMxIZOC6E2TvjBhVgtfYqTC-DcvFSB0Gv-B-ZtvAFvJdu_4V8SHFi4FT9ccVjDhsxo6Dwrz4xH0YDltt_jti2BsKK-O2g2B5vm3F-PD9Udvo7OGRhhJRD3epaZP0ZTDXu6lGkmUW9omho6IOGdxVKt9nIfgLopvigDh2xrbG694Al3YfVQd94zMoFwvu91PjsgGKGvT62ngJjbQ9WjASFdFLzGE83nAeOwh5BKQaG4pWDrCFEUf581Dxn4kzTa2oKTVAFCc5yHNws6Di1TtcDSovYR5lwKKncHT9O9bvjdMxN0qAzTfiGWkTh3LNp-S17NLi_WxrMu1GmlmDN432vFtgjzt5BohNnm1drR85AkB2_1I5gQGbOdriv4nnfmhA6v1IsAAt2UvqdidF8bMUrlVRfAhENcZ-gNPJ177XkQ3_hOwp0EmQemr6c_RuuwRYpYoQtTG0c4' \\\r\n -H 'Content-Type: application/json' \\\r\n\r\n\r\n**Expected behavior**\r\n\r\nI expect the API call to authorise and return relevant data.\r\n\r\n**Screenshots**\r\nIf applicable, add screenshots to help explain your problem.\r\n\r\n**Server (please complete the following information):**\r\n - Snipe-IT Version: v4.7.6 - build 4143 (master)\r\n - OS: CloudLinux\r\n - Web Server: Apache (WHM/cPanel)\r\n - PHP Version: 7.2\r\n\r\n**Desktop (please complete the following information):**\r\n - OS: Windows 10\r\n - Browser N/A\r\n - Version N/A\r\n\r\n**Smartphone (please complete the following information):**\r\n - Device: [e.g. iPhone6]\r\n - OS: [e.g. iOS8.1]\r\n - Browser [e.g. stock browser, safari]\r\n - Version [e.g. 
22]\r\n\r\n**Error Messages**\r\n- Include any additional information you can find in `storage/logs` and your webserver's logs.\r\nNo errors reported.\r\n\r\n**Additional context**\r\n- Is this a fresh install or an upgrade? Fresh\r\n- What OS and web server you're running Snipe-IT on: cPanel/CloudLinux\r\n- What method you used to install Snipe-IT (install.sh, manual installation, docker, etc): Git\r\n- Include what you've done so far in the installation, and if you got any error messages along the way: Everything else works prefectly\r\n- Indicate whether or not you've manually edited any data directly in the database: No\r\n\r\nAdd any other context about the problem here.\r\n\r\nEverything works well, just can't seem to get authorisation against hte API.\r\n\r\nPlease do not post an issue without answering the related questions above. If you have opened a different issue and already answered these questions, answer them again, once for every ticket. It will be next to impossible for us to help you.\r\n","title":"Authentication issues with API","body":"#### Please confirm you have done the following before posting your bug report:\r\n\r\n- [ x] I have enabled debug mode \r\n- [ x] I have read [checked the Common Issues page](https://snipe-it.readme.io/docs/common-issues)\r\n\r\n**Describe the bug**\r\nCalls to the API return unauthorised error. I have generated a personal token as per the documentation and tried several different keys. I also tried creating from the command line. None seem to work.\r\n\r\n`{\r\n \"error\": \"Unauthorized.\"\r\n}`\r\n\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior:\r\n\r\nNote - bearer token has been changed for security. 
\r\n\r\ncurl -X GET \\\r\n https://snipeit.axiomit.com.au/api/v1/companies \\\r\n -H 'Accept: application/json' \\\r\n -H 'Authorization: Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiIsImp0aSI6ImNkNWU4MzQzNWNiZTEwMzQxZGQwZGQ3NmNiZjVmODY4YjQ3ZWYzNWU0Yjk1NmQ0M2MxYzZlZmU1NmU5ZGFiMGNiMWQ3NGExNDY2ZTg0NTZiIn0.eyJhdWQiOiIzIiwianRpIjoiY2Q1ZTgzNDM1Y2JlMTAzNDFkZDBkZDc2Y2JmNWY4NjhiNDdlZjM1ZTRiOTU2ZDQzYzFjNmVmZTU2ZTlkYWIwY2IxZDc0YTE0NjZlODQ1NmIiLCJpYXQiOjE1NjU1MDM3NDUsIm5iZiI6MTU2NTUwMzc0NSwiZXhwIjoxNTk3MTI2MTQ1LCJzdWIiOiIxIiwic2NvcGVzIjpbXX0.SlTsSAMenCirBv0pEXxCz1wISIGBzXT9MWnkR5jHk2XY3KQhwyIRyfyLHI64VuSWvdfWyKMm8sZuiDgL_b3JV5A9IZOKBq6eUjhNkEky0TQ8_dvYh3ACfZl38N1_WKROli65kCfuP2KTAQIna2exSKb6up8ATfnH0ErvPRiHiakjMeJMnP4fZnRXHHWjHIBPSEZJCr2BQRYcrVdgj06NX334x0UW7LDhWMxIZOC6E2TvjBhVgtfYqTC-DcvFSB0Gv-B-ZtvAFvJdu_4V8SHFi4FT9ccVjDhsxo6Dwrz4xH0YDltt_jti2BsKK-O2g2B5vm3F-PD9Udvo7OGRhhJRD3epaZP0ZTDXu6lGkmUW9omho6IOGdxVKt9nIfgLopvigDh2xrbG694Al3YfVQd94zMoFwvu91PjsgGKGvT62ngJjbQ9WjASFdFLzGE83nAeOwh5BKQaG4pWDrCFEUf581Dxn4kzTa2oKTVAFCc5yHNws6Di1TtcDSovYR5lwKKncHT9O9bvjdMxN0qAzTfiGWkTh3LNp-S17NLi_WxrMu1GmlmDN432vFtgjzt5BohNnm1drR85AkB2_1I5gQGbOdriv4nnfmhA6v1IsAAt2UvqdidF8bMUrlVRfAhENcZ-gNPJ177XkQ3_hOwp0EmQemr6c_RuuwRYpYoQtTG0c4' \\\r\n -H 'Content-Type: application/json' \\\r\n\r\n\r\n**Expected behavior**\r\n\r\nI expect the API call to authorise and return relevant data.\r\n\r\n**Screenshots**\r\nIf applicable, add screenshots to help explain your problem.\r\n\r\n**Server (please complete the following information):**\r\n - Snipe-IT Version: v4.7.6 - build 4143 (master)\r\n - OS: CloudLinux\r\n - Web Server: Apache (WHM/cPanel)\r\n - PHP Version: 7.2\r\n\r\n**Desktop (please complete the following information):**\r\n - OS: Windows 10\r\n - Browser N/A\r\n - Version N/A\r\n\r\n**Smartphone (please complete the following information):**\r\n - Device: [e.g. iPhone6]\r\n - OS: [e.g. iOS8.1]\r\n - Browser [e.g. stock browser, safari]\r\n - Version [e.g. 
22]\r\n\r\n**Error Messages**\r\n- Include any additional information you can find in `storage/logs` and your webserver's logs.\r\nNo errors reported.\r\n\r\n**Additional context**\r\n- Is this a fresh install or an upgrade? Fresh\r\n- What OS and web server you're running Snipe-IT on: cPanel/CloudLinux\r\n- What method you used to install Snipe-IT (install.sh, manual installation, docker, etc): Git\r\n- Include what you've done so far in the installation, and if you got any error messages along the way: Everything else works prefectly\r\n- Indicate whether or not you've manually edited any data directly in the database: No\r\n\r\nAdd any other context about the problem here.\r\n\r\nEverything works well, just can't seem to get authorisation against hte API.\r\n\r\nPlease do not post an issue without answering the related questions above. If you have opened a different issue and already answered these questions, answer them again, once for every ticket. It will be next to impossible for us to help you.\r\n","html":"

Authentication issues with API

\n\n

Please confirm you have done the following before posting your bug report:

\n\n\n\n

Describe the bug\nCalls to the API return unauthorised error. I have generated a personal token as per the documentation and tried several different keys. I also tried creating from the command line. None seem to work.

\n\n

{\n \"error\": \"Unauthorized.\"\n}

\n\n

To Reproduce\nSteps to reproduce the behavior:

\n\n

Note - bearer token has been changed for security.

\n\n

curl -X GET \\\n https://snipeit.axiomit.com.au/api/v1/companies \\\n -H 'Accept: application/json' \\\n -H 'Authorization: Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiIsImp0aSI6ImNkNWU4MzQzNWNiZTEwMzQxZGQwZGQ3NmNiZjVmODY4YjQ3ZWYzNWU0Yjk1NmQ0M2MxYzZlZmU1NmU5ZGFiMGNiMWQ3NGExNDY2ZTg0NTZiIn0.eyJhdWQiOiIzIiwianRpIjoiY2Q1ZTgzNDM1Y2JlMTAzNDFkZDBkZDc2Y2JmNWY4NjhiNDdlZjM1ZTRiOTU2ZDQzYzFjNmVmZTU2ZTlkYWIwY2IxZDc0YTE0NjZlODQ1NmIiLCJpYXQiOjE1NjU1MDM3NDUsIm5iZiI6MTU2NTUwMzc0NSwiZXhwIjoxNTk3MTI2MTQ1LCJzdWIiOiIxIiwic2NvcGVzIjpbXX0.SlTsSAMenCirBv0pEXxCz1wISIGBzXT9MWnkR5jHk2XY3KQhwyIRyfyLHI64VuSWvdfWyKMm8sZuiDgLb3JV5A9IZOKBq6eUjhNkEky0TQ8dvYh3ACfZl38N1WKROli65kCfuP2KTAQIna2exSKb6up8ATfnH0ErvPRiHiakjMeJMnP4fZnRXHHWjHIBPSEZJCr2BQRYcrVdgj06NX334x0UW7LDhWMxIZOC6E2TvjBhVgtfYqTC-DcvFSB0Gv-B-ZtvAFvJdu4V8SHFi4FT9ccVjDhsxo6Dwrz4xH0YDlttjti2BsKK-O2g2B5vm3F-PD9Udvo7OGRhhJRD3epaZP0ZTDXu6lGkmUW9omho6IOGdxVKt9nIfgLopvigDh2xrbG694Al3YfVQd94zMoFwvu91PjsgGKGvT62ngJjbQ9WjASFdFLzGE83nAeOwh5BKQaG4pWDrCFEUf581Dxn4kzTa2oKTVAFCc5yHNws6Di1TtcDSovYR5lwKKncHT9O9bvjdMxN0qAzTfiGWkTh3LNp-S17NLiWxrMu1GmlmDN432vFtgjzt5BohNnm1drR85AkB21I5gQGbOdriv4nnfmhA6v1IsAAt2UvqdidF8bMUrlVRfAhENcZ-gNPJ177XkQ3hOwp0EmQemr6c_RuuwRYpYoQtTG0c4' \\\n -H 'Content-Type: application/json' \\

\n\n

Expected behavior

\n\n

I expect the API call to authorise and return relevant data.

\n\n

Screenshots\nIf applicable, add screenshots to help explain your problem.

\n\n

Server (please complete the following information):\n - Snipe-IT Version: v4.7.6 - build 4143 (master)\n - OS: CloudLinux\n - Web Server: Apache (WHM/cPanel)\n - PHP Version: 7.2

\n\n

Desktop (please complete the following information):\n - OS: Windows 10\n - Browser N/A\n - Version N/A

\n\n

Smartphone (please complete the following information):\n - Device: [e.g. iPhone6]\n - OS: [e.g. iOS8.1]\n - Browser [e.g. stock browser, safari]\n - Version [e.g. 22]

\n\n

Error Messages\n- Include any additional information you can find in storage/logs and your webserver's logs.\nNo errors reported.

\n\n

Additional context\n- Is this a fresh install or an upgrade? Fresh\n- What OS and web server you're running Snipe-IT on: cPanel/CloudLinux\n- What method you used to install Snipe-IT (install.sh, manual installation, docker, etc): Git\n- Include what you've done so far in the installation, and if you got any error messages along the way: Everything else works prefectly\n- Indicate whether or not you've manually edited any data directly in the database: No

\n\n

Add any other context about the problem here.

\n\n

Everything works well, just can't seem to get authorisation against hte API.

\n\n

Please do not post an issue without answering the related questions above. If you have opened a different issue and already answered these questions, answer them again, once for every ticket. It will be next to impossible for us to help you.

\n","meta":{"source":"GitHub","url":"https://github.com/snipe/snipe-it/issues/7343"},"_input_hash":389977770,"_task_hash":368235777,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# CherryPy and pytest fixtures in tests' class\n\n**I'm submitting a ...**\r\n- [X] bug report\r\n- [ ] feature request\r\n- [ ] question about the decisions made in the repository\r\n\r\n**Do you want to request a *feature* or report a *bug*?**\r\nI think is a sort of \"bug\", but I'm not sure to understand what's the intended behaviour.\r\n\r\n**What is the current behavior?**\r\nWhen I run a test method with pytest inside a test class to group different tests together, the test fails if a fixture is required by the test method (intended as a pytest fixture).\r\nIt seems that this is due to the helper.CPWebCase class, that for some reason tries to use unittest (even if the documentation tells to install pytest).\r\n\r\n\r\n**If the current behavior is a bug, please provide the steps to reproduce and if possible a screenshots and logs of the problem. If you can, show us your code.**\r\n\r\nA minimal example to reproduce the problem:\r\n\r\n```\r\nimport pytest\r\nfrom cherrypy.test import helper\r\n\r\n@pytest.fixture\r\ndef foo():\r\n return \"Ok!\"\r\n\r\nclass TestSample(helper.CPWebCase):\r\n\r\n def test_sample(self, foo):\r\n return foo is not None\r\n```\r\n\r\nIf you run the test with pytest you will get:\r\n`TypeError: test_sample() missing 1 required positional argument: 'foo'`\r\n\r\n**What is the expected behavior?**\r\nI would expect that if pytest is the suggested testing framework it will work as expected also with the helpers of CherryPy, so fixtures should be correctly injected in the test methods.\r\nOr, at least, the documentation should state somewhere that pytest is used for other purposes, but not all the features are supported in these cases. 
\r\n\r\n\r\n**Please tell us about your environment:**\r\n\r\n- Cheroot version: 6.5.5\r\n- CherryPy version: 18.1.2\r\n- Python version: 3.7.3\r\n- OS: Windows\r\n- Browser: not relevant.\r\n- pytest: 5.0.1","title":"CherryPy and pytest fixtures in tests' class","body":"**I'm submitting a ...**\r\n- [X] bug report\r\n- [ ] feature request\r\n- [ ] question about the decisions made in the repository\r\n\r\n**Do you want to request a *feature* or report a *bug*?**\r\nI think is a sort of \"bug\", but I'm not sure to understand what's the intended behaviour.\r\n\r\n**What is the current behavior?**\r\nWhen I run a test method with pytest inside a test class to group different tests together, the test fails if a fixture is required by the test method (intended as a pytest fixture).\r\nIt seems that this is due to the helper.CPWebCase class, that for some reason tries to use unittest (even if the documentation tells to install pytest).\r\n\r\n\r\n**If the current behavior is a bug, please provide the steps to reproduce and if possible a screenshots and logs of the problem. If you can, show us your code.**\r\n\r\nA minimal example to reproduce the problem:\r\n\r\n```\r\nimport pytest\r\nfrom cherrypy.test import helper\r\n\r\n@pytest.fixture\r\ndef foo():\r\n return \"Ok!\"\r\n\r\nclass TestSample(helper.CPWebCase):\r\n\r\n def test_sample(self, foo):\r\n return foo is not None\r\n```\r\n\r\nIf you run the test with pytest you will get:\r\n`TypeError: test_sample() missing 1 required positional argument: 'foo'`\r\n\r\n**What is the expected behavior?**\r\nI would expect that if pytest is the suggested testing framework it will work as expected also with the helpers of CherryPy, so fixtures should be correctly injected in the test methods.\r\nOr, at least, the documentation should state somewhere that pytest is used for other purposes, but not all the features are supported in these cases. 
\r\n\r\n\r\n**Please tell us about your environment:**\r\n\r\n- Cheroot version: 6.5.5\r\n- CherryPy version: 18.1.2\r\n- Python version: 3.7.3\r\n- OS: Windows\r\n- Browser: not relevant.\r\n- pytest: 5.0.1","html":"

CherryPy and pytest fixtures in tests' class

\n\n

I'm submitting a ...\n- [X] bug report\n- [ ] feature request\n- [ ] question about the decisions made in the repository

\n\n

Do you want to request a feature or report a bug?\nI think is a sort of \"bug\", but I'm not sure to understand what's the intended behaviour.

\n\n

What is the current behavior?\nWhen I run a test method with pytest inside a test class to group different tests together, the test fails if a fixture is required by the test method (intended as a pytest fixture).\nIt seems that this is due to the helper.CPWebCase class, that for some reason tries to use unittest (even if the documentation tells to install pytest).

\n\n

If the current behavior is a bug, please provide the steps to reproduce and if possible a screenshots and logs of the problem. If you can, show us your code.

\n\n

A minimal example to reproduce the problem:

\n\n

```\nimport pytest\nfrom cherrypy.test import helper

\n\n

@pytest.fixture\ndef foo():\n return \"Ok!\"

\n\n

class TestSample(helper.CPWebCase):

\n\n
def test_sample(self, foo):\n    return foo is not None\n
\n\n

```

\n\n

If you run the test with pytest you will get:\nTypeError: test_sample() missing 1 required positional argument: 'foo'

\n\n

What is the expected behavior?\nI would expect that if pytest is the suggested testing framework it will work as expected also with the helpers of CherryPy, so fixtures should be correctly injected in the test methods.\nOr, at least, the documentation should state somewhere that pytest is used for other purposes, but not all the features are supported in these cases.

\n\n

Please tell us about your environment:

\n\n
    \n
  • Cheroot version: 6.5.5
  • \n
  • CherryPy version: 18.1.2
  • \n
  • Python version: 3.7.3
  • \n
  • OS: Windows
  • \n
  • Browser: not relevant.
  • \n
  • pytest: 5.0.1
  • \n
\n","meta":{"source":"GitHub","url":"https://github.com/cherrypy/cherrypy/issues/1798"},"_input_hash":-2030043538,"_task_hash":-2104666726,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Docsite is out of date, specifically section on looping over dicts","meta":{"source":"GitHub","url":"https://github.com/ansible/ansible/issues/27341"},"label":"DOCUMENTATION","_input_hash":-498978293,"_task_hash":-919472391,"answer":"accept"} {"text":"Using ES6 imports, everything global under AWS namespace","meta":{"source":"GitHub","url":"https://github.com/aws/amazon-cognito-identity-js/issues/483"},"label":"DOCUMENTATION","_input_hash":-500030504,"_task_hash":-1923022966,"answer":"reject"} {"text":"Add support for cpus (Version 3 Resources Key)","meta":{"source":"GitHub","url":"https://github.com/kubernetes/kompose/issues/730"},"label":"DOCUMENTATION","_input_hash":90329394,"_task_hash":186696501,"answer":"reject"} {"text":"Website URL is down","meta":{"source":"GitHub","url":"https://github.com/eea/sparql-client/issues/12"},"label":"DOCUMENTATION","_input_hash":-1188084682,"_task_hash":933270907,"answer":"accept"} {"text":"# [Document] orm.Pagination Pagination not declared\n\nHi, when using https://github.com/go-pg/pg/wiki/Writing-Queries#pagination example, I found out orm.Pagination is gone. By checking the README.md, I figure out I think I should use https://godoc.org/github.com/go-pg/pg/urlvalues#NewPager instead. \r\n\r\n(1) The doc should be updated. \r\n(2) Can I have an example how to use this new Pagination way to do `?page=2&limit=50`? The old way looks pretty simple. I feel I have to create a Filter struct now. \r\n\r\nThanks a lot. ","title":"[Document] orm.Pagination Pagination not declared","body":"Hi, when using https://github.com/go-pg/pg/wiki/Writing-Queries#pagination example, I found out orm.Pagination is gone. 
By checking the README.md, I figure out I think I should use https://godoc.org/github.com/go-pg/pg/urlvalues#NewPager instead. \r\n\r\n(1) The doc should be updated. \r\n(2) Can I have an example how to use this new Pagination way to do `?page=2&limit=50`? The old way looks pretty simple. I feel I have to create a Filter struct now. \r\n\r\nThanks a lot. ","html":"

[Document] orm.Pagination Pagination not declared

\n\n

Hi, when using https://github.com/go-pg/pg/wiki/Writing-Queries#pagination example, I found out orm.Pagination is gone. By checking the README.md, I figure out I think I should use https://godoc.org/github.com/go-pg/pg/urlvalues#NewPager instead.

\n\n

(1) The doc should be updated. \n(2) Can I have an example how to use this new Pagination way to do ?page=2&limit=50? The old way looks pretty simple. I feel I have to create a Filter struct now.

\n\n

Thanks a lot.

\n","meta":{"source":"GitHub","url":"https://github.com/go-pg/pg/issues/1339"},"_input_hash":-625587826,"_task_hash":409067473,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"Documentation should indicate if function is sync or async","meta":{"source":"GitHub","url":"https://github.com/mapbox/mapbox-gl-js/issues/5050"},"label":"DOCUMENTATION","_input_hash":2093919669,"_task_hash":805489247,"answer":"accept"} {"text":"# Documentation page doesn't seem to work\n\nDocumentation page doesn't seem to work.\r\n`https://www.typescriptlang.org/docs/home.html` throws `The resource you are looking for has been removed, had its name changed, or is temporarily unavailable.`.","title":"Documentation page doesn't seem to work","body":"Documentation page doesn't seem to work.\r\n`https://www.typescriptlang.org/docs/home.html` throws `The resource you are looking for has been removed, had its name changed, or is temporarily unavailable.`.","html":"

Documentation page doesn't seem to work

\n\n

Documentation page doesn't seem to work.\nhttps://www.typescriptlang.org/docs/home.html throws The resource you are looking for has been removed, had its name changed, or is temporarily unavailable..

\n","meta":{"source":"GitHub","url":"https://github.com/microsoft/TypeScript-Website/issues/42"},"_input_hash":1822226737,"_task_hash":699611723,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"Missing File according to doc (tests/python/gpu/test_conv.py)","meta":{"source":"GitHub","url":"https://github.com/dmlc/mxnet/issues/7204"},"label":"DOCUMENTATION","_input_hash":-1603300390,"_task_hash":-1896598932,"answer":"accept"} {"text":"Unexpected behavior using create-react-app ","meta":{"source":"GitHub","url":"https://github.com/inferusvv/react-odometerjs/issues/6"},"label":"DOCUMENTATION","_input_hash":427111587,"_task_hash":1175420423,"answer":"reject"} {"text":"Can't connect to torrent client qBitTorrent v3.3.14","meta":{"source":"GitHub","url":"https://github.com/SchizoDuckie/DuckieTV/issues/928"},"label":"DOCUMENTATION","_input_hash":433243140,"_task_hash":-1738604605,"answer":"reject"} {"text":"Readme.md","meta":{"source":"GitHub","url":"https://github.com/SickGear/SickGear.Docker/issues/17"},"label":"DOCUMENTATION","_input_hash":1598451205,"_task_hash":-722277214,"answer":"accept"} {"text":"Can't access scripts in v0.6.0-dev2","meta":{"source":"GitHub","url":"https://github.com/antonpup/Aurora/issues/690"},"label":"DOCUMENTATION","_input_hash":1325251195,"_task_hash":1057629390,"answer":"reject"} {"text":"Half empty debian","meta":{"source":"GitHub","url":"https://github.com/davetcoleman/ros_control_boilerplate/issues/13"},"label":"DOCUMENTATION","_input_hash":2044180603,"_task_hash":-1080699134,"answer":"reject"} {"text":"# TODO List\n\n- [ ] Resource Reduced Model\r\n- [ ] Distribute Strategy\r\n- [ ] README for Korean\r\n- [ ] preprocessing with method1\r\n- [ ] add sample audio url","title":"TODO List","body":"- [ ] Resource Reduced Model\r\n- [ ] Distribute Strategy\r\n- [ ] README for Korean\r\n- [ ] preprocessing with method1\r\n- [ ] add sample audio url","html":"

TODO List

\n\n
    \n
  • [ ] Resource Reduced Model
  • \n
  • [ ] Distribute Strategy
  • \n
  • [ ] README for Korean
  • \n
  • [ ] preprocessing with method1
  • \n
  • [ ] add sample audio url
  • \n
\n","meta":{"source":"GitHub","url":"https://github.com/jason9693/MusicTransformer-tensorflow2.0/issues/4"},"_input_hash":2038688380,"_task_hash":1331601266,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Copilot support","meta":{"source":"GitHub","url":"https://github.com/seankross/twilio/issues/4"},"label":"DOCUMENTATION","_input_hash":-1304632468,"_task_hash":1443386158,"answer":"reject"} {"text":"# Create Compatibility with Object.fromEntries() with ES5\n\n\r\n\r\n## Search Terms\r\n\r\n\r\n\r\nObject.fromEntries()\r\n\r\n## Suggestion\r\n\r\n\r\nI'd like to see `Object.fromEntries()` available to use as a function when compiling to the browser compatible ES5.\r\n\r\n## Use Cases\r\n\r\n\r\n\r\nUsing `Object.fromEntries()` when targeting ES5 alongside the friendly `Object.entries()`\r\n\r\nhttps://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/fromEntries\r\n\r\n## Examples\r\n\r\n\r\n\r\nHere is the polyfill from TC39 https://github.com/tc39/proposal-object-from-entries/blob/master/polyfill.js\r\n```js\r\n \r\nfunction ObjectFromEntries(iter) {\r\n const obj = {};\r\n\r\n for (const pair of iter) {\r\n if (Object(pair) !== pair) {\r\n throw new TypeError('iterable for fromEntries should yield objects');\r\n }\r\n\r\n // Consistency with Map: contract is that entry has \"0\" and \"1\" keys, not\r\n // that it is an array or iterable.\r\n\r\n const { '0': key, '1': val } = pair;\r\n\r\n Object.defineProperty(obj, key, {\r\n configurable: true,\r\n enumerable: true,\r\n writable: true,\r\n value: val,\r\n });\r\n }\r\n\r\n return obj;\r\n}\r\n\r\n```\r\n\r\n## Checklist\r\n\r\nMy suggestion meets these guidelines:\r\n\r\n* [x] This wouldn't be a breaking change in existing TypeScript/JavaScript code\r\n* [x] This wouldn't change the runtime behavior of existing JavaScript code\r\n* [x] This could be implemented without emitting different JS based on the types of the expressions\r\n* [x] This isn't a runtime 
feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, etc.)\r\n* [x] This feature would agree with the rest of [TypeScript's Design Goals](https://github.com/Microsoft/TypeScript/wiki/TypeScript-Design-Goals).\r\n\r\n## My Questions\r\n+ Is this feature left out intentionally?\r\n+ A guide/where to look when contributing to adding a feature like this?","title":"Create Compatibility with Object.fromEntries() with ES5","body":"\r\n\r\n## Search Terms\r\n\r\n\r\n\r\nObject.fromEntries()\r\n\r\n## Suggestion\r\n\r\n\r\nI'd like to see `Object.fromEntries()` available to use as a function when compiling to the browser compatible ES5.\r\n\r\n## Use Cases\r\n\r\n\r\n\r\nUsing `Object.fromEntries()` when targeting ES5 alongside the friendly `Object.entries()`\r\n\r\nhttps://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/fromEntries\r\n\r\n## Examples\r\n\r\n\r\n\r\nHere is the polyfill from TC39 https://github.com/tc39/proposal-object-from-entries/blob/master/polyfill.js\r\n```js\r\n \r\nfunction ObjectFromEntries(iter) {\r\n const obj = {};\r\n\r\n for (const pair of iter) {\r\n if (Object(pair) !== pair) {\r\n throw new TypeError('iterable for fromEntries should yield objects');\r\n }\r\n\r\n // Consistency with Map: contract is that entry has \"0\" and \"1\" keys, not\r\n // that it is an array or iterable.\r\n\r\n const { '0': key, '1': val } = pair;\r\n\r\n Object.defineProperty(obj, key, {\r\n configurable: true,\r\n enumerable: true,\r\n writable: true,\r\n value: val,\r\n });\r\n }\r\n\r\n return obj;\r\n}\r\n\r\n```\r\n\r\n## Checklist\r\n\r\nMy suggestion meets these guidelines:\r\n\r\n* [x] This wouldn't be a breaking change in existing TypeScript/JavaScript code\r\n* [x] This wouldn't change the runtime behavior of existing JavaScript code\r\n* [x] This could be implemented without emitting different JS based on the types of the expressions\r\n* [x] This isn't a runtime feature (e.g. 
library functionality, non-ECMAScript syntax with JavaScript output, etc.)\r\n* [x] This feature would agree with the rest of [TypeScript's Design Goals](https://github.com/Microsoft/TypeScript/wiki/TypeScript-Design-Goals).\r\n\r\n## My Questions\r\n+ Is this feature left out intentionally?\r\n+ A guide/where to look when contributing to adding a feature like this?","html":"

Create Compatibility with Object.fromEntries() with ES5

\n\n\n\n

Search Terms

\n\n\n\n

Object.fromEntries()

\n\n

Suggestion

\n\n

\nI'd like to see Object.fromEntries() available to use as a function when compiling to the browser compatible ES5.

\n\n

Use Cases

\n\n\n\n

Using Object.fromEntries() when targeting ES5 alongside the friendly Object.entries()

\n\n

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/fromEntries

\n\n

Examples

\n\n\n\n

Here is the polyfill from TC39: https://github.com/tc39/proposal-object-from-entries/blob/master/polyfill.js

```js
function ObjectFromEntries(iter) {
  const obj = {};

  for (const pair of iter) {
    if (Object(pair) !== pair) {
      throw new TypeError('iterable for fromEntries should yield objects');
    }

    // Consistency with Map: contract is that entry has "0" and "1" keys, not
    // that it is an array or iterable.
    const { '0': key, '1': val } = pair;

    Object.defineProperty(obj, key, {
      configurable: true,
      enumerable: true,
      writable: true,
      value: val,
    });
  }

  return obj;
}
```
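For illustration, a self-contained sketch of how the polyfill behaves; the function body mirrors the TC39 polyfill quoted above, and the sample inputs are hypothetical:

```javascript
// Mirrors the TC39 polyfill: builds an object from an iterable of entry pairs.
function ObjectFromEntries(iter) {
  const obj = {};
  for (const pair of iter) {
    if (Object(pair) !== pair) {
      throw new TypeError('iterable for fromEntries should yield objects');
    }
    // Entries only need "0" and "1" keys; they need not be arrays.
    const { '0': key, '1': val } = pair;
    Object.defineProperty(obj, key, {
      configurable: true,
      enumerable: true,
      writable: true,
      value: val,
    });
  }
  return obj;
}

// Accepts any iterable of [key, value] pairs: arrays, Maps, Object.entries() output.
console.log(ObjectFromEntries([['a', 1], ['b', 2]])); // { a: 1, b: 2 }
console.log(ObjectFromEntries(new Map([['x', 10]])).x); // 10
```

This is the round-trip complement to `Object.entries()`, which is what makes the pairing requested in the issue useful under an ES5 target.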

\n\n

Checklist

\n\n

My suggestion meets these guidelines:

\n\n
    \n
  • [x] This wouldn't be a breaking change in existing TypeScript/JavaScript code
  • \n
  • [x] This wouldn't change the runtime behavior of existing JavaScript code
  • \n
  • [x] This could be implemented without emitting different JS based on the types of the expressions
  • \n
  • [x] This isn't a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, etc.)
  • \n
  • [x] This feature would agree with the rest of TypeScript's Design Goals.
  • \n
\n\n

My Questions

\n\n
    \n
  • Is this feature left out intentionally?
  • \n
  • Is there a guide, or a pointer to where to look, for contributing a feature like this?
  • \n
\n","meta":{"source":"GitHub","url":"https://github.com/microsoft/TypeScript/issues/32803"},"_input_hash":1145065675,"_task_hash":882905419,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# Consider a minor refactor of the Secrets API to be more pluggable\n\nWe could introduce a Secret base class that Prefect inherits from for it's secrets, and provide documentation about how to subclass for implementing other sensitive storage options.\r\n\r\nAs discussed [here](https://github.com/PrefectHQ/prefect/pull/1343/files#r312719069)","title":"Consider a minor refactor of the Secrets API to be more pluggable","body":"We could introduce a Secret base class that Prefect inherits from for it's secrets, and provide documentation about how to subclass for implementing other sensitive storage options.\r\n\r\nAs discussed [here](https://github.com/PrefectHQ/prefect/pull/1343/files#r312719069)","html":"

Consider a minor refactor of the Secrets API to be more pluggable

\n\n

We could introduce a Secret base class that Prefect inherits from for its secrets, and provide documentation on how to subclass it to implement other sensitive storage options.

\n\n

As discussed here

\n","meta":{"source":"GitHub","url":"https://github.com/PrefectHQ/prefect/issues/1346"},"_input_hash":1442044513,"_task_hash":2031405368,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Documentation: Installation / Running","meta":{"source":"GitHub","url":"https://github.com/swagger-api/swagger-editor/issues/1423"},"label":"DOCUMENTATION","_input_hash":1627792931,"_task_hash":-1093006713,"answer":"accept"} {"text":"running yamp with docker","meta":{"source":"GitHub","url":"https://github.com/alesssia/YAMP/issues/1"},"label":"DOCUMENTATION","_input_hash":803213832,"_task_hash":85691114,"answer":"reject"} {"text":"# Quick Install - Fails mounting BPF crio due to multiple mount points\n\n**General Information**\r\n\r\n- Cilium version (run `cilium version`)\r\n - v1.6.90\r\n- Kernel version (run `uname -a`)\r\n - 5.2.7-816.native\r\n- Orchestration system version in use (e.g. `kubectl version`, Mesos, ...)\r\n - v1.15.2\r\n- Link to relevant artifacts (policies, deployments scripts, ...)\r\n - https://raw.githubusercontent.com/cilium/cilium/master/install/kubernetes/quick-install.yaml\r\n- Upload a system dump (run `curl -sLO\r\nreleases.cilium.io/tools/cluster-diagnosis.zip &&\r\npython cluster-diagnosis.zip sysdump` and then attach the generated zip file)\r\n\r\n**How to reproduce the issue**\r\n\r\n1. Paso 1\r\n - kubeadm init on master running on Clear Linux using crio\r\n2. Paso 2\r\n - kubectl apply -f https://.../quick-install.yaml\r\n3. 
Paso 3\r\n - Get the following error on the cilium pods:\r\n```\r\nlevel=info msg=\"Cilium 1.6.90 a96d7f4fe 2019-08-06T11:32:30-07:00 go version go1.12.7 linux/amd64\" subsys=daemon \r\nlevel=info msg=\"cilium-envoy version: 2e42144f26667ddee9f5d2506019f16c57386b29/1.11.0-dev/Modified/RELEASE/BoringSSL\" subsys=daemon \r\nlevel=info msg=\"clang (7.0.0) and kernel (5.2.7) versions: OK!\" subsys=daemon \r\nlevel=info msg=\"linking environment: OK!\" subsys=daemon \r\nlevel=info msg=\"bpf_requirements check: OK!\" subsys=daemon \r\nlevel=info msg=\"Detected mounted BPF filesystem at /sys/fs/bpf\" subsys=bpf \r\nlevel=fatal msg=\"Unable to mount BPF filesystem\" error=\"multiple mount points detected at /sys/fs/bpf\" subsys=bpf\r\n```\r\nFYI - Deploying the old way works every time:\r\nkubectl apply -f https://raw.githubusercontent.com/cilium/cilium/1.6-backports-19-07-25/examples/kubernetes/1.15/cilium-crio.yaml\r\n\r\nFrom a quick look it seems that the crio specific deployment doesn't mount /sys/fs/bpf which looks to be the main issue.\r\n\r\nCheers","title":"Quick Install - Fails mounting BPF crio due to multiple mount points","body":"**General Information**\r\n\r\n- Cilium version (run `cilium version`)\r\n - v1.6.90\r\n- Kernel version (run `uname -a`)\r\n - 5.2.7-816.native\r\n- Orchestration system version in use (e.g. `kubectl version`, Mesos, ...)\r\n - v1.15.2\r\n- Link to relevant artifacts (policies, deployments scripts, ...)\r\n - https://raw.githubusercontent.com/cilium/cilium/master/install/kubernetes/quick-install.yaml\r\n- Upload a system dump (run `curl -sLO\r\nreleases.cilium.io/tools/cluster-diagnosis.zip &&\r\npython cluster-diagnosis.zip sysdump` and then attach the generated zip file)\r\n\r\n**How to reproduce the issue**\r\n\r\n1. Paso 1\r\n - kubeadm init on master running on Clear Linux using crio\r\n2. Paso 2\r\n - kubectl apply -f https://.../quick-install.yaml\r\n3. 
Paso 3\r\n - Get the following error on the cilium pods:\r\n```\r\nlevel=info msg=\"Cilium 1.6.90 a96d7f4fe 2019-08-06T11:32:30-07:00 go version go1.12.7 linux/amd64\" subsys=daemon \r\nlevel=info msg=\"cilium-envoy version: 2e42144f26667ddee9f5d2506019f16c57386b29/1.11.0-dev/Modified/RELEASE/BoringSSL\" subsys=daemon \r\nlevel=info msg=\"clang (7.0.0) and kernel (5.2.7) versions: OK!\" subsys=daemon \r\nlevel=info msg=\"linking environment: OK!\" subsys=daemon \r\nlevel=info msg=\"bpf_requirements check: OK!\" subsys=daemon \r\nlevel=info msg=\"Detected mounted BPF filesystem at /sys/fs/bpf\" subsys=bpf \r\nlevel=fatal msg=\"Unable to mount BPF filesystem\" error=\"multiple mount points detected at /sys/fs/bpf\" subsys=bpf\r\n```\r\nFYI - Deploying the old way works every time:\r\nkubectl apply -f https://raw.githubusercontent.com/cilium/cilium/1.6-backports-19-07-25/examples/kubernetes/1.15/cilium-crio.yaml\r\n\r\nFrom a quick look it seems that the crio specific deployment doesn't mount /sys/fs/bpf which looks to be the main issue.\r\n\r\nCheers","html":"

Quick Install - Fails mounting BPF crio due to multiple mount points

\n\n

General Information

\n\n
    \n
  • Cilium version (run cilium version)\n
      \n
    • v1.6.90
    • \n
  • \n
  • Kernel version (run uname -a)\n
      \n
    • 5.2.7-816.native
    • \n
  • \n
  • Orchestration system version in use (e.g. kubectl version, Mesos, ...)\n
      \n
    • v1.15.2
    • \n
  • \n
  • Link to relevant artifacts (policies, deployments scripts, ...)\n
      \n
    • https://raw.githubusercontent.com/cilium/cilium/master/install/kubernetes/quick-install.yaml
    • \n
  • \n
  • Upload a system dump (run curl -sLO\nreleases.cilium.io/tools/cluster-diagnosis.zip &&\npython cluster-diagnosis.zip sysdump and then attach the generated zip file)
  • \n
\n\n

How to reproduce the issue

\n\n
    \n
  1. Step 1\n
      \n
    • kubeadm init on master running on Clear Linux using crio
    • \n
  2. \n
  3. Step 2\n
      \n
    • kubectl apply -f https://.../quick-install.yaml
    • \n
  4. \n
  5. Step 3\n
      \n
    • Get the following error on the cilium pods:\n\nlevel=info msg=\"Cilium 1.6.90 a96d7f4fe 2019-08-06T11:32:30-07:00 go version go1.12.7 linux/amd64\" subsys=daemon \nlevel=info msg=\"cilium-envoy version: 2e42144f26667ddee9f5d2506019f16c57386b29/1.11.0-dev/Modified/RELEASE/BoringSSL\" subsys=daemon \nlevel=info msg=\"clang (7.0.0) and kernel (5.2.7) versions: OK!\" subsys=daemon \nlevel=info msg=\"linking environment: OK!\" subsys=daemon \nlevel=info msg=\"bpf_requirements check: OK!\" subsys=daemon \nlevel=info msg=\"Detected mounted BPF filesystem at /sys/fs/bpf\" subsys=bpf \nlevel=fatal msg=\"Unable to mount BPF filesystem\" error=\"multiple mount points detected at /sys/fs/bpf\" subsys=bpf\n\nFYI - Deploying the old way works every time:\nkubectl apply -f https://raw.githubusercontent.com/cilium/cilium/1.6-backports-19-07-25/examples/kubernetes/1.15/cilium-crio.yaml
    • \n
  6. \n
\n\n

From a quick look, it seems the crio-specific deployment doesn't mount /sys/fs/bpf, which looks to be the main issue.

\n\n
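The fatal error above fires when Cilium finds more than one mount-table entry for /sys/fs/bpf. A quick way to check is to count the entries in the mount table; the sample input below is hypothetical, and on a live node you would read /proc/mounts directly:

```shell
# On a real node: awk '$2 == "/sys/fs/bpf"' /proc/mounts
# Hypothetical sample mount-table lines showing the duplicate-mount condition:
printf '%s\n' \
  'bpf /sys/fs/bpf bpf rw,nosuid,nodev,noexec,relatime 0 0' \
  'bpf /sys/fs/bpf bpf rw,nosuid,nodev,noexec,relatime 0 0' \
  | awk '$2 == "/sys/fs/bpf" { n++ } END { print n " mount point(s) at /sys/fs/bpf" }'
# prints: 2 mount point(s) at /sys/fs/bpf
```

Anything other than exactly one matching line reproduces the "multiple mount points detected" condition the daemon reports.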

Cheers

\n","meta":{"source":"GitHub","url":"https://github.com/cilium/cilium/issues/8869"},"_input_hash":-2073264010,"_task_hash":1091409182,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Document rate limiting","meta":{"source":"GitHub","url":"https://github.com/monzo/docs/issues/66"},"label":"DOCUMENTATION","_input_hash":-692315771,"_task_hash":509010200,"answer":"accept"} {"text":"Instructions for installing MySQL sakila-schema fail","meta":{"source":"GitHub","url":"https://github.com/jdeathe/centos-ssh-mysql/issues/145"},"label":"DOCUMENTATION","_input_hash":-2000584010,"_task_hash":-2107239258,"answer":"accept"} {"text":"Alertmanager's storage spec should reflect prometheus storage spec for Operator version v0.11.0","meta":{"source":"GitHub","url":"https://github.com/coreos/prometheus-operator/issues/516"},"label":"DOCUMENTATION","_input_hash":518410085,"_task_hash":-557789014,"answer":"reject"} {"text":"# Cannot find SocketIOClient on deployment to Heroku\n\nUpon deploying to Heroku, my build failed with the following message: \r\n\r\n```\r\nremote: lib/store/index.ts(31,18): error TS2503: Cannot find namespace 'SocketIOClient'.\r\nremote: lib/store/index.ts(40,14): error TS2503: Cannot find namespace 'SocketIOClient'.\r\nremote: error Command failed with exit code 2.\r\nremote: info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.\r\nremote: error Command failed with exit code 2.\r\nremote: info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.\r\n```\r\nand Heroku rejected the build. I was able to solve this problem by manually moving my `node_modules/@types` folder to a local folder (in my case, `app/lib`) and redirecting `typeRoots` inside tsconfig towards the folder I moved it to, instead of node_modules. \r\n\r\nThis error should be reproducible by just taking a fresh copy of this repo, and following instructions to upload it to Heroku. 
I'm not exactly sure what the underlying problem is, as the app builds fine on my machine. Error occurs only upon deployment to Heroku.","title":"Cannot find SocketIOClient on deployment to Heroku","body":"Upon deploying to Heroku, my build failed with the following message: \r\n\r\n```\r\nremote: lib/store/index.ts(31,18): error TS2503: Cannot find namespace 'SocketIOClient'.\r\nremote: lib/store/index.ts(40,14): error TS2503: Cannot find namespace 'SocketIOClient'.\r\nremote: error Command failed with exit code 2.\r\nremote: info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.\r\nremote: error Command failed with exit code 2.\r\nremote: info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.\r\n```\r\nand Heroku rejected the build. I was able to solve this problem by manually moving my `node_modules/@types` folder to a local folder (in my case, `app/lib`) and redirecting `typeRoots` inside tsconfig towards the folder I moved it to, instead of node_modules. \r\n\r\nThis error should be reproducible by just taking a fresh copy of this repo, and following instructions to upload it to Heroku. I'm not exactly sure what the underlying problem is, as the app builds fine on my machine. Error occurs only upon deployment to Heroku.","html":"

Cannot find SocketIOClient on deployment to Heroku

\n\n

Upon deploying to Heroku, my build failed with the following message:

\n\n

\nremote: lib/store/index.ts(31,18): error TS2503: Cannot find namespace 'SocketIOClient'.\nremote: lib/store/index.ts(40,14): error TS2503: Cannot find namespace 'SocketIOClient'.\nremote: error Command failed with exit code 2.\nremote: info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.\nremote: error Command failed with exit code 2.\nremote: info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.\n\nand Heroku rejected the build. I was able to solve this problem by manually moving my node_modules/@types folder to a local folder (in my case, app/lib) and redirecting typeRoots inside tsconfig towards the folder I moved it to, instead of node_modules.

\n\n
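A sketch of the workaround described above, assuming the moved types folder ends up at `app/lib/@types` (the `app/lib` location is the reporter's local choice, not a fixed convention):

```json
{
  "compilerOptions": {
    "typeRoots": ["./app/lib/@types"]
  }
}
```

With `typeRoots` set, the compiler resolves global type packages from that folder instead of `node_modules/@types`, which sidesteps whatever Heroku's build environment was doing to the default lookup.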

This error should be reproducible by taking a fresh copy of this repo and following the instructions to deploy it to Heroku. I'm not exactly sure what the underlying problem is, as the app builds fine on my machine; the error occurs only on deployment to Heroku.

\n","meta":{"source":"GitHub","url":"https://github.com/async-labs/saas/issues/34"},"_input_hash":-1998415694,"_task_hash":1847504921,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"cmd/go: cannot build go binary on Windows","meta":{"source":"GitHub","url":"https://github.com/golang/go/issues/21169"},"label":"DOCUMENTATION","_input_hash":-1587230324,"_task_hash":-1211733677,"answer":"reject"} {"text":"Decide which tools to use for planning","meta":{"source":"GitHub","url":"https://github.com/chingu-voyage-turtles-team-5/chrome-extension-clone/issues/2"},"label":"DOCUMENTATION","_input_hash":1546340914,"_task_hash":-1631550149,"answer":"reject"} {"text":"WIKI Help Image","meta":{"source":"GitHub","url":"https://github.com/gitterHQ/sidecar/issues/61"},"label":"DOCUMENTATION","_input_hash":210003708,"_task_hash":-628243344,"answer":"accept"} {"text":"# protoLoader.loadSync not finding proto file\n\nHi there, it appears that when executing the code from the readme the protoLoader module is not able to locate the `./protobufs/anki_vector/messaging/external_interface.proto` file.\r\n\r\nHere is the error message I receive:\r\n`Error: ENOENT: no such file or directory, open 'protobufs/anki_vector/messaging/external_interface.proto'`\r\n\r\nI forked the repo to see if I am able to come up with a solution.","title":"protoLoader.loadSync not finding proto file","body":"Hi there, it appears that when executing the code from the readme the protoLoader module is not able to locate the `./protobufs/anki_vector/messaging/external_interface.proto` file.\r\n\r\nHere is the error message I receive:\r\n`Error: ENOENT: no such file or directory, open 'protobufs/anki_vector/messaging/external_interface.proto'`\r\n\r\nI forked the repo to see if I am able to come up with a solution.","html":"

protoLoader.loadSync not finding proto file

\n\n

Hi there, it appears that when executing the code from the readme the protoLoader module is not able to locate the ./protobufs/anki_vector/messaging/external_interface.proto file.

\n\n

Here is the error message I receive:\nError: ENOENT: no such file or directory, open 'protobufs/anki_vector/messaging/external_interface.proto'

\n\n

I forked the repo to see if I am able to come up with a solution.

\n","meta":{"source":"GitHub","url":"https://github.com/KishCom/anki-vector-nodejs/issues/2"},"_input_hash":-901545619,"_task_hash":139724084,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"`composer create-project` fails with \"No such file or directory\"","meta":{"source":"GitHub","url":"https://github.com/phpList/base-distribution/issues/23"},"label":"DOCUMENTATION","_input_hash":210974501,"_task_hash":81241437,"answer":"reject"} {"text":"# Make the extension Antora aware\n\nThe extension should automatically detect an Antora documentation component and resolve external resources (images, partials, examples...).\r\n","title":"Make the extension Antora aware","body":"The extension should automatically detect an Antora documentation component and resolve external resources (images, partials, examples...).\r\n","html":"

Make the extension Antora aware

\n\n

The extension should automatically detect an Antora documentation component and resolve external resources (images, partials, examples...).

\n","meta":{"source":"GitHub","url":"https://github.com/asciidoctor/asciidoctor-browser-extension/issues/299"},"_input_hash":-2011566860,"_task_hash":1524027057,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# Optimize nginx configuration for production based on Laravel documentation\n\nhttps://laravel.com/docs/5.8/deployment","title":"Optimize nginx configuration for production based on Laravel documentation","body":"https://laravel.com/docs/5.8/deployment","html":"

Optimize nginx configuration for production based on Laravel documentation

\n\n

https://laravel.com/docs/5.8/deployment

\n","meta":{"source":"GitHub","url":"https://github.com/necrommunity/necrolab/issues/455"},"_input_hash":-402203195,"_task_hash":-448469179,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"# Project Setup Feedback\n\nYour README doesn't describe what your project is. This is the first thing anyone (read: a potential employer) sees when they browse this project. For a first step, copy your project outline into the `README.md`.","title":"Project Setup Feedback","body":"Your README doesn't describe what your project is. This is the first thing anyone (read: a potential employer) sees when they browse this project. For a first step, copy your project outline into the `README.md`.","html":"

Project Setup Feedback

\n\n

Your README doesn't describe what your project is. This is the first thing anyone (read: a potential employer) sees when they browse this project. For a first step, copy your project outline into the README.md.

\n","meta":{"source":"GitHub","url":"https://github.com/achang209/liftoff-assignments/issues/4"},"_input_hash":1421012795,"_task_hash":-1048076036,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"Failure to build caffe","meta":{"source":"GitHub","url":"https://github.com/intel/caffe/issues/105"},"label":"DOCUMENTATION","_input_hash":-531953646,"_task_hash":1389320680,"answer":"reject"} {"text":"Badge for readme files","meta":{"source":"GitHub","url":"https://github.com/CompuIves/codesandbox-client/issues/89"},"label":"DOCUMENTATION","_input_hash":-1857910631,"_task_hash":341401567,"answer":"accept"} {"text":"Add empty commit instructions to troubleshooting","meta":{"source":"GitHub","url":"https://github.com/platformsh/platformsh-docs/issues/605"},"label":"DOCUMENTATION","_input_hash":-1371933794,"_task_hash":-1438213268,"answer":"accept"} {"text":"issue creating multiple docker nodes with vmwarefusion driver","meta":{"source":"GitHub","url":"https://github.com/docker/machine/issues/4208"},"label":"DOCUMENTATION","_input_hash":1359133568,"_task_hash":621610618,"answer":"reject"} {"text":"snap fails after \u00b4\u00b4main.go:220: WARNING: cannot create syslog logger","meta":{"source":"GitHub","url":"https://github.com/Microsoft/BashOnWindows/issues/2374"},"label":"DOCUMENTATION","_input_hash":-244547613,"_task_hash":-1847780066,"answer":"reject"} {"text":"# AcceptLoginRequest results in EOF\n\nWhen accepting a login request via hydra, the rest api/ go sdk responds with an Bad Request -> EOF.\r\n\r\nSteps to reproduce the behavior:\r\n\r\n\r\n\r\n*Server response + logs*\r\n\r\nLogs:\r\n\r\n```\r\nhydra_1 | time=\"2019-08-11T09:18:17Z\" level=error msg=\"An error occurred while handling a request\" code=400 debug= details=\"map[]\" error=EOF reason= request-id= status= trace=\"Stack trace: 
\\ngithub.com/ory/hydra/consent.(*Handler).AcceptLoginRequest\\n\\t/go/src/github.com/ory/hydra/consent/handler.go:320\\ngithub.com/julienschmidt/httprouter.(*Router).ServeHTTP\\n\\t/go/pkg/mod/github.com/julienschmidt/httprouter@v1.2.0/router.go:334\\ngithub.com/urfave/negroni.Wrap.func1\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:46\\ngithub.com/urfave/negroni.HandlerFunc.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:29\\ngithub.com/urfave/negroni.middleware.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\\nnet/http.HandlerFunc.ServeHTTP\\n\\t/usr/local/go/src/net/http/server.go:1995\\ngithub.com/ory/hydra/x.RejectInsecureRequests.func1\\n\\t/go/src/github.com/ory/hydra/x/tls_termination.go:55\\ngithub.com/urfave/negroni.HandlerFunc.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:29\\ngithub.com/urfave/negroni.middleware.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\\ngithub.com/ory/x/metricsx.(*Service).ServeHTTP\\n\\t/go/pkg/mod/github.com/ory/x@v0.0.64/metricsx/middleware.go:260\\ngithub.com/urfave/negroni.middleware.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\\ngithub.com/ory/hydra/metrics/prometheus.(*MetricsManager).ServeHTTP\\n\\t/go/src/github.com/ory/hydra/metrics/prometheus/middleware.go:26\\ngithub.com/urfave/negroni.middleware.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\\ngithub.com/meatballhat/negroni-logrus.(*Middleware).ServeHTTP\\n\\t/go/pkg/mod/github.com/meatballhat/negroni-logrus@v0.0.0-20170801195057-31067281800f/middleware.go:136\\ngithub.com/urfave/negroni.middleware.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\\ngithub.com/urfave/negroni.(*Negroni).ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:96\\nnet/http.serverHandler.ServeHTTP\\n\\t/usr/local/go/src/net/http/server.go:2774\\nnet/http.(*conn).serve\\n\\t/usr/l
ocal/go/src/net/http/server.go:1878\\nruntime.goexit\\n\\t/usr/local/go/src/runtime/asm_amd64.s:1337\" writer=JSON\r\n```\r\n\r\nResponse\r\n```json\r\n{\r\n \"error\": \"error\",\r\n \"error_description\": \"The error is unrecognizable.\",\r\n \"status_code\": 500,\r\n \"error_debug\": \"EOF\",\r\n \"request_id\": \"\"\r\n}\r\n```\r\n\r\n*Server configuration*\r\n\r\n```yml\r\nversion: \"3.7\"\r\nservices:\r\n dbdev:\r\n image: \"postgres:11\"\r\n ports:\r\n - \"5432:5432\"\r\n environment:\r\n - \"POSTGRES_PASSWORD=postgres\"\r\n - \"POSTGRES_USER=postgres\"\r\n - \"POSTGRES_DB=postgres\"\r\n hydra:\r\n image: \"oryd/hydra:v1.0.0\"\r\n environment:\r\n - \"URLS_SELF_ISSUER=http://localhost:4444\"\r\n - \"URLS_CONSENT=http://localhost:4200/consent\"\r\n - \"URLS_LOGIN=http://localhost:4200/login\"\r\n - \"URLS_LOGOUT=http://localhost:4200/logout\"\r\n - \"DSN=postgres://hola:hola@hydradb:5432/holadb?sslmode=disable\"\r\n - \"SECRETS_SYSTEM=youReallyNeedToChangeThis\"\r\n - \"OIDC_SUBJECT_TYPES_SUPPORTED=public,pairwise\"\r\n - \"OIDC_SUBJECT_TYPE_PAIRWISE_SALT=youReallyNeedToChangeThis\"\r\n ports:\r\n - \"4444:4444\"\r\n - \"4445:4445\"\r\n - \"4446:4446\"\r\n - \"5555:5555\"\r\n command: serve all --dangerous-force-http\r\n depends_on:\r\n - hydra-migrate\r\n hydradb:\r\n image: \"postgres:11\"\r\n environment:\r\n - \"POSTGRES_PASSWORD=hola\"\r\n - \"POSTGRES_USER=hola\"\r\n - \"POSTGRES_DB=holadb\"\r\n ports:\r\n - \"5433:5432\"\r\n hydra-migrate:\r\n image: oryd/hydra:latest\r\n environment:\r\n - \"DSN=postgres://hola:hola@hydradb:5432/holadb?sslmode=disable\"\r\n command:\r\n migrate sql -e --yes\r\n restart: on-failure\r\n```\r\n\r\n**Expected behavior**\r\n\r\nHydra sends the redirectTo Uri as response.\r\n\r\n**Environment**\r\n\r\n* Version: oryd/hydra:v1.0.0\r\n* Environment: MacOS, Docker Desktop 2.1.0.1, ...\r\n\r\n**Additional context**\r\n\r\nA example User Service / Identity Provider should be implemented. 
The Client sends its email and password, the service should mark the request as accepted via hydra.\r\n","title":"AcceptLoginRequest results in EOF","body":"When accepting a login request via hydra, the rest api/ go sdk responds with an Bad Request -> EOF.\r\n\r\nSteps to reproduce the behavior:\r\n\r\n\r\n\r\n*Server response + logs*\r\n\r\nLogs:\r\n\r\n```\r\nhydra_1 | time=\"2019-08-11T09:18:17Z\" level=error msg=\"An error occurred while handling a request\" code=400 debug= details=\"map[]\" error=EOF reason= request-id= status= trace=\"Stack trace: \\ngithub.com/ory/hydra/consent.(*Handler).AcceptLoginRequest\\n\\t/go/src/github.com/ory/hydra/consent/handler.go:320\\ngithub.com/julienschmidt/httprouter.(*Router).ServeHTTP\\n\\t/go/pkg/mod/github.com/julienschmidt/httprouter@v1.2.0/router.go:334\\ngithub.com/urfave/negroni.Wrap.func1\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:46\\ngithub.com/urfave/negroni.HandlerFunc.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:29\\ngithub.com/urfave/negroni.middleware.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\\nnet/http.HandlerFunc.ServeHTTP\\n\\t/usr/local/go/src/net/http/server.go:1995\\ngithub.com/ory/hydra/x.RejectInsecureRequests.func1\\n\\t/go/src/github.com/ory/hydra/x/tls_termination.go:55\\ngithub.com/urfave/negroni.HandlerFunc.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:29\\ngithub.com/urfave/negroni.middleware.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\\ngithub.com/ory/x/metricsx.(*Service).ServeHTTP\\n\\t/go/pkg/mod/github.com/ory/x@v0.0.64/metricsx/middleware.go:260\\ngithub.com/urfave/negroni.middleware.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\\ngithub.com/ory/hydra/metrics/prometheus.(*MetricsManager).ServeHTTP\\n\\t/go/src/github.com/ory/hydra/metrics/prometheus/middleware.go:26\\ngithub.com/urfave/negroni.middleware.ServeHTTP\\n\\t/go/pkg/mod/git
hub.com/urfave/negroni@v1.0.0/negroni.go:38\\ngithub.com/meatballhat/negroni-logrus.(*Middleware).ServeHTTP\\n\\t/go/pkg/mod/github.com/meatballhat/negroni-logrus@v0.0.0-20170801195057-31067281800f/middleware.go:136\\ngithub.com/urfave/negroni.middleware.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\\ngithub.com/urfave/negroni.(*Negroni).ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:96\\nnet/http.serverHandler.ServeHTTP\\n\\t/usr/local/go/src/net/http/server.go:2774\\nnet/http.(*conn).serve\\n\\t/usr/local/go/src/net/http/server.go:1878\\nruntime.goexit\\n\\t/usr/local/go/src/runtime/asm_amd64.s:1337\" writer=JSON\r\n```\r\n\r\nResponse\r\n```json\r\n{\r\n \"error\": \"error\",\r\n \"error_description\": \"The error is unrecognizable.\",\r\n \"status_code\": 500,\r\n \"error_debug\": \"EOF\",\r\n \"request_id\": \"\"\r\n}\r\n```\r\n\r\n*Server configuration*\r\n\r\n```yml\r\nversion: \"3.7\"\r\nservices:\r\n dbdev:\r\n image: \"postgres:11\"\r\n ports:\r\n - \"5432:5432\"\r\n environment:\r\n - \"POSTGRES_PASSWORD=postgres\"\r\n - \"POSTGRES_USER=postgres\"\r\n - \"POSTGRES_DB=postgres\"\r\n hydra:\r\n image: \"oryd/hydra:v1.0.0\"\r\n environment:\r\n - \"URLS_SELF_ISSUER=http://localhost:4444\"\r\n - \"URLS_CONSENT=http://localhost:4200/consent\"\r\n - \"URLS_LOGIN=http://localhost:4200/login\"\r\n - \"URLS_LOGOUT=http://localhost:4200/logout\"\r\n - \"DSN=postgres://hola:hola@hydradb:5432/holadb?sslmode=disable\"\r\n - \"SECRETS_SYSTEM=youReallyNeedToChangeThis\"\r\n - \"OIDC_SUBJECT_TYPES_SUPPORTED=public,pairwise\"\r\n - \"OIDC_SUBJECT_TYPE_PAIRWISE_SALT=youReallyNeedToChangeThis\"\r\n ports:\r\n - \"4444:4444\"\r\n - \"4445:4445\"\r\n - \"4446:4446\"\r\n - \"5555:5555\"\r\n command: serve all --dangerous-force-http\r\n depends_on:\r\n - hydra-migrate\r\n hydradb:\r\n image: \"postgres:11\"\r\n environment:\r\n - \"POSTGRES_PASSWORD=hola\"\r\n - \"POSTGRES_USER=hola\"\r\n - \"POSTGRES_DB=holadb\"\r\n 
ports:\r\n - \"5433:5432\"\r\n hydra-migrate:\r\n image: oryd/hydra:latest\r\n environment:\r\n - \"DSN=postgres://hola:hola@hydradb:5432/holadb?sslmode=disable\"\r\n command:\r\n migrate sql -e --yes\r\n restart: on-failure\r\n```\r\n\r\n**Expected behavior**\r\n\r\nHydra sends the redirectTo Uri as response.\r\n\r\n**Environment**\r\n\r\n* Version: oryd/hydra:v1.0.0\r\n* Environment: MacOS, Docker Desktop 2.1.0.1, ...\r\n\r\n**Additional context**\r\n\r\nA example User Service / Identity Provider should be implemented. The Client sends its email and password, the service should mark the request as accepted via hydra.\r\n","html":"

AcceptLoginRequest results in EOF

\n\n

When accepting a login request via Hydra, the REST API / Go SDK responds with a Bad Request -> EOF.

\n\n

Steps to reproduce the behavior:

\n\n\n\n

Server response + logs

\n\n

Logs:

\n\n

\nhydra_1 | time=\"2019-08-11T09:18:17Z\" level=error msg=\"An error occurred while handling a request\" code=400 debug= details=\"map[]\" error=EOF reason= request-id= status= trace=\"Stack trace: \\ngithub.com/ory/hydra/consent.(*Handler).AcceptLoginRequest\\n\\t/go/src/github.com/ory/hydra/consent/handler.go:320\\ngithub.com/julienschmidt/httprouter.(*Router).ServeHTTP\\n\\t/go/pkg/mod/github.com/julienschmidt/httprouter@v1.2.0/router.go:334\\ngithub.com/urfave/negroni.Wrap.func1\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:46\\ngithub.com/urfave/negroni.HandlerFunc.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:29\\ngithub.com/urfave/negroni.middleware.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\\nnet/http.HandlerFunc.ServeHTTP\\n\\t/usr/local/go/src/net/http/server.go:1995\\ngithub.com/ory/hydra/x.RejectInsecureRequests.func1\\n\\t/go/src/github.com/ory/hydra/x/tls_termination.go:55\\ngithub.com/urfave/negroni.HandlerFunc.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:29\\ngithub.com/urfave/negroni.middleware.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\\ngithub.com/ory/x/metricsx.(*Service).ServeHTTP\\n\\t/go/pkg/mod/github.com/ory/x@v0.0.64/metricsx/middleware.go:260\\ngithub.com/urfave/negroni.middleware.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\\ngithub.com/ory/hydra/metrics/prometheus.(*MetricsManager).ServeHTTP\\n\\t/go/src/github.com/ory/hydra/metrics/prometheus/middleware.go:26\\ngithub.com/urfave/negroni.middleware.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\\ngithub.com/meatballhat/negroni-logrus.(*Middleware).ServeHTTP\\n\\t/go/pkg/mod/github.com/meatballhat/negroni-logrus@v0.0.0-20170801195057-31067281800f/middleware.go:136\\ngithub.com/urfave/negroni.middleware.ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\\ngithub.com/urfave/negroni.(*Neg
roni).ServeHTTP\\n\\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:96\\nnet/http.serverHandler.ServeHTTP\\n\\t/usr/local/go/src/net/http/server.go:2774\\nnet/http.(*conn).serve\\n\\t/usr/local/go/src/net/http/server.go:1878\\nruntime.goexit\\n\\t/usr/local/go/src/runtime/asm_amd64.s:1337\" writer=JSON\n

\n\n

Response\njson\n{\n \"error\": \"error\",\n \"error_description\": \"The error is unrecognizable.\",\n \"status_code\": 500,\n \"error_debug\": \"EOF\",\n \"request_id\": \"\"\n}\n

\n\n

Server configuration

\n\n

yml\nversion: \"3.7\"\nservices:\n dbdev:\n image: \"postgres:11\"\n ports:\n - \"5432:5432\"\n environment:\n - \"POSTGRES_PASSWORD=postgres\"\n - \"POSTGRES_USER=postgres\"\n - \"POSTGRES_DB=postgres\"\n hydra:\n image: \"oryd/hydra:v1.0.0\"\n environment:\n - \"URLS_SELF_ISSUER=http://localhost:4444\"\n - \"URLS_CONSENT=http://localhost:4200/consent\"\n - \"URLS_LOGIN=http://localhost:4200/login\"\n - \"URLS_LOGOUT=http://localhost:4200/logout\"\n - \"DSN=postgres://hola:hola@hydradb:5432/holadb?sslmode=disable\"\n - \"SECRETS_SYSTEM=youReallyNeedToChangeThis\"\n - \"OIDC_SUBJECT_TYPES_SUPPORTED=public,pairwise\"\n - \"OIDC_SUBJECT_TYPE_PAIRWISE_SALT=youReallyNeedToChangeThis\"\n ports:\n - \"4444:4444\"\n - \"4445:4445\"\n - \"4446:4446\"\n - \"5555:5555\"\n command: serve all --dangerous-force-http\n depends_on:\n - hydra-migrate\n hydradb:\n image: \"postgres:11\"\n environment:\n - \"POSTGRES_PASSWORD=hola\"\n - \"POSTGRES_USER=hola\"\n - \"POSTGRES_DB=holadb\"\n ports:\n - \"5433:5432\"\n hydra-migrate:\n image: oryd/hydra:latest\n environment:\n - \"DSN=postgres://hola:hola@hydradb:5432/holadb?sslmode=disable\"\n command:\n migrate sql -e --yes\n restart: on-failure\n

\n\n

Expected behavior

\n\n

Hydra sends the redirectTo URI as the response.

\n\n

Environment

\n\n
    \n
  • Version: oryd/hydra:v1.0.0
  • \n
  • Environment: MacOS, Docker Desktop 2.1.0.1, ...
  • \n
\n\n

Additional context

\n\n

An example User Service / Identity Provider should be implemented. The client sends its email and password, and the service should mark the request as accepted via Hydra.

\n","meta":{"source":"GitHub","url":"https://github.com/ory/hydra/issues/1524"},"_input_hash":217693793,"_task_hash":1283925233,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Feature request: IPV6 Loadbalancing support","meta":{"source":"GitHub","url":"https://github.com/terraform-providers/terraform-provider-google/issues/245"},"label":"DOCUMENTATION","_input_hash":-1714245171,"_task_hash":1194642336,"answer":"reject"} {"text":"Encryption at rest key rotation is not working","meta":{"source":"GitHub","url":"https://github.com/kubernetes/kubernetes/issues/49565"},"label":"DOCUMENTATION","_input_hash":365714381,"_task_hash":-1619983448,"answer":"reject"} {"text":"remark-iframes: incorrect docs","meta":{"source":"GitHub","url":"https://github.com/zestedesavoir/zmarkdown/issues/130"},"label":"DOCUMENTATION","_input_hash":1332914009,"_task_hash":-1788877701,"answer":"accept"} {"text":"Built with grunt","meta":{"source":"GitHub","url":"https://github.com/asaladino/dev-docker/issues/10"},"label":"DOCUMENTATION","_input_hash":-1802533899,"_task_hash":453248439,"answer":"reject"} {"text":"# Windows cannot verify the digital signature of the driver required for this device. A piece of software or hardware has recently changed, and a file with a wrong signature or corruption may be installed, or the installed file may be malware of unknown origin. (Code 52)\n\n_The template below is mostly useful for bug reports and support questions. 
Feel free to remove anything which doesn't apply to you and add more information where it makes sense._\r\n\r\n_Also, before reporting a new issue, please make sure that:_\r\n\r\n- _You read carefully the [documentation and frequently asked questions](https://github.com/NVIDIA/nvidia-docker/wiki)._\r\n- _You [searched](https://github.com/NVIDIA/nvidia-docker/issues?utf8=%E2%9C%93&q=is%3Aissue) for a similar issue and this is not a duplicate of an existing one._\r\n- _This issue is not related to [NGC](https://github.com/NVIDIA/nvidia-docker/wiki/NGC), otherwise, please use the [devtalk forums](https://devtalk.nvidia.com/default/board/200/nvidia-gpu-cloud-ngc-users/) instead._\r\n- _You went through the [troubleshooting](https://github.com/NVIDIA/nvidia-docker/wiki/Troubleshooting) steps._\r\n\r\n---\r\n\r\n### 1. Issue or feature description\r\n\r\n### 2. Steps to reproduce the issue\r\n\r\n### 3. Information to [attach](https://help.github.com/articles/file-attachments-on-issues-and-pull-requests/) (optional if deemed irrelevant)\r\n\r\n - [ ] Some nvidia-container information: `nvidia-container-cli -k -d /dev/tty info`\r\n - [ ] Kernel version from `uname -a`\r\n - [ ] Any relevant kernel output lines from `dmesg`\r\n - [ ] Driver information from `nvidia-smi -a`\r\n - [ ] Docker version from `docker version`\r\n - [ ] NVIDIA packages version from `dpkg -l '*nvidia*'` _or_ `rpm -qa '*nvidia*'`\r\n - [ ] NVIDIA container library version from `nvidia-container-cli -V`\r\n - [ ] NVIDIA container library logs (see [troubleshooting](https://github.com/NVIDIA/nvidia-docker/wiki/Troubleshooting))\r\n - [ ] Docker command, image and tag used\r\n","title":"Windows cannot verify the digital signature of the driver required for this device. A piece of software or hardware has recently changed, and a file with a wrong signature or corruption may be installed, or the installed file may be malware of unknown origin. 
(Code 52)","body":"_The template below is mostly useful for bug reports and support questions. Feel free to remove anything which doesn't apply to you and add more information where it makes sense._\r\n\r\n_Also, before reporting a new issue, please make sure that:_\r\n\r\n- _You read carefully the [documentation and frequently asked questions](https://github.com/NVIDIA/nvidia-docker/wiki)._\r\n- _You [searched](https://github.com/NVIDIA/nvidia-docker/issues?utf8=%E2%9C%93&q=is%3Aissue) for a similar issue and this is not a duplicate of an existing one._\r\n- _This issue is not related to [NGC](https://github.com/NVIDIA/nvidia-docker/wiki/NGC), otherwise, please use the [devtalk forums](https://devtalk.nvidia.com/default/board/200/nvidia-gpu-cloud-ngc-users/) instead._\r\n- _You went through the [troubleshooting](https://github.com/NVIDIA/nvidia-docker/wiki/Troubleshooting) steps._\r\n\r\n---\r\n\r\n### 1. Issue or feature description\r\n\r\n### 2. Steps to reproduce the issue\r\n\r\n### 3. Information to [attach](https://help.github.com/articles/file-attachments-on-issues-and-pull-requests/) (optional if deemed irrelevant)\r\n\r\n - [ ] Some nvidia-container information: `nvidia-container-cli -k -d /dev/tty info`\r\n - [ ] Kernel version from `uname -a`\r\n - [ ] Any relevant kernel output lines from `dmesg`\r\n - [ ] Driver information from `nvidia-smi -a`\r\n - [ ] Docker version from `docker version`\r\n - [ ] NVIDIA packages version from `dpkg -l '*nvidia*'` _or_ `rpm -qa '*nvidia*'`\r\n - [ ] NVIDIA container library version from `nvidia-container-cli -V`\r\n - [ ] NVIDIA container library logs (see [troubleshooting](https://github.com/NVIDIA/nvidia-docker/wiki/Troubleshooting))\r\n - [ ] Docker command, image and tag used\r\n","html":"

Windows cannot verify the digital signature of the driver required for this device. A piece of software or hardware has recently changed, and a file with a wrong signature or corruption may be installed, or the installed file may be malware of unknown origin. (Code 52)

\n\n

The template below is mostly useful for bug reports and support questions. Feel free to remove anything which doesn't apply to you and add more information where it makes sense.

\n\n

Also, before reporting a new issue, please make sure that:

\n\n\n\n
\n\n

1. Issue or feature description

\n\n

2. Steps to reproduce the issue

\n\n

3. Information to attach (optional if deemed irrelevant)

\n\n
    \n
  • [ ] Some nvidia-container information: nvidia-container-cli -k -d /dev/tty info
  • \n
  • [ ] Kernel version from uname -a
  • \n
  • [ ] Any relevant kernel output lines from dmesg
  • \n
  • [ ] Driver information from nvidia-smi -a
  • \n
  • [ ] Docker version from docker version
  • \n
  • [ ] NVIDIA packages version from dpkg -l '*nvidia*' or rpm -qa '*nvidia*'
  • \n
  • [ ] NVIDIA container library version from nvidia-container-cli -V
  • \n
  • [ ] NVIDIA container library logs (see troubleshooting)
  • \n
  • [ ] Docker command, image and tag used
  • \n
\n","meta":{"source":"GitHub","url":"https://github.com/NVIDIA/nvidia-docker/issues/1045"},"_input_hash":-783761641,"_task_hash":-847300024,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Infinite loop when calling `brew style --fix`","meta":{"source":"GitHub","url":"https://github.com/Homebrew/brew/issues/2948"},"label":"DOCUMENTATION","_input_hash":1584489695,"_task_hash":-612108503,"answer":"reject"} {"text":"saving the file","meta":{"source":"GitHub","url":"https://github.com/learn-co-curriculum/js-from-dom-to-node/issues/15"},"label":"DOCUMENTATION","_input_hash":-1240163366,"_task_hash":635662773,"answer":"reject"} {"text":"Add info to upgrade manual","meta":{"source":"GitHub","url":"https://github.com/Spomky-Labs/otphp/issues/88"},"label":"DOCUMENTATION","_input_hash":-1023719744,"_task_hash":918889695,"answer":"accept"} {"text":"# add to document about android sdk licenses agreement\n\nHhi. To prevent additional searching add to your dicumentation commands to agree android license agreement\r\n\r\nhttps://stackoverflow.com/questions/39760172/you-have-not-accepted-the-license-agreements-of-the-following-sdk-components\r\n\r\n```\r\non Windows:\r\n\r\ncd \"%ANDROID_HOME%\"/tools/bin\r\nRun the sdkmanager as follows:\r\n\r\nsdkmanager --licenses\r\nAnd accept the licenses you did not accept yet (but need to).\r\n\r\nFor more details see the Android Studio documentation, although the current documentation is missing any description on the --licenses option.\r\n```","title":"add to document about android sdk licenses agreement","body":"Hhi. 
To prevent additional searching, add to your documentation the commands to accept the Android license agreement\r\n\r\nhttps://stackoverflow.com/questions/39760172/you-have-not-accepted-the-license-agreements-of-the-following-sdk-components\r\n\r\n```\r\non Windows:\r\n\r\ncd \"%ANDROID_HOME%\"/tools/bin\r\nRun the sdkmanager as follows:\r\n\r\nsdkmanager --licenses\r\nAnd accept the licenses you did not accept yet (but need to).\r\n\r\nFor more details see the Android Studio documentation, although the current documentation is missing any description on the --licenses option.\r\n```","html":"

add to document about android sdk licenses agreement

\n\n

Hi. To prevent additional searching, add to your documentation the commands to accept the Android license agreement

\n\n

https://stackoverflow.com/questions/39760172/you-have-not-accepted-the-license-agreements-of-the-following-sdk-components

\n\n

```\non Windows:

\n\n

cd \"%ANDROID_HOME%\"/tools/bin\nRun the sdkmanager as follows:

\n\n

sdkmanager --licenses\nAnd accept the licenses you did not accept yet (but need to).

\n\n

For more details see the Android Studio documentation, although the current documentation is missing any description on the --licenses option.\n```

\n","meta":{"source":"GitHub","url":"https://github.com/distriqt/ANE-CustomResources/issues/40"},"_input_hash":-1263163627,"_task_hash":-1185768490,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"Update Docs on Security Context and Policy","meta":{"source":"GitHub","url":"https://github.com/kubernetes/kubernetes.github.io/issues/4492"},"label":"DOCUMENTATION","_input_hash":-1601165728,"_task_hash":2094110786,"answer":"accept"} {"text":"Please install GAMS in Codecov testing framework","meta":{"source":"GitHub","url":"https://github.com/Pyomo/pyomo/issues/193"},"label":"DOCUMENTATION","_input_hash":1102293459,"_task_hash":1399198622,"answer":"reject"} {"text":"service documentation links to all nodes need updated.","meta":{"source":"GitHub","url":"https://github.com/watson-developer-cloud/node-red-node-watson/issues/311"},"label":"DOCUMENTATION","_input_hash":-1165153407,"_task_hash":-2105762087,"answer":"accept"} {"text":"# Using useReducer will have `errors` as an empty object.\n\n**Describe the bug**\r\n`errors` is empty if it's coming from `useReducer`\r\n\r\n**To Reproduce**\r\nhttps://codesandbox.io/s/my-app-wuloq\r\n\r\n**Expected behavior**\r\n`errors` before `useReducer` seems to have correct properties. I am expecting to have the same values in errors coming from `state`.\r\n\r\n```tsx\r\n// https://codesandbox.io/s/my-app-wuloq\r\n// App.tsx\r\n\r\n // This has error properties when the form is invalid\r\n console.log(\"coming directly from useForm - errors:\", errors)\r\n // This seems it's always empty\r\n console.log(\"coming from state - errors:\", state.errors)\r\n\r\n```","title":"Using useReducer will have `errors` as an empty object.","body":"**Describe the bug**\r\n`errors` is empty if it's coming from `useReducer`\r\n\r\n**To Reproduce**\r\nhttps://codesandbox.io/s/my-app-wuloq\r\n\r\n**Expected behavior**\r\n`errors` before `useReducer` seems to have correct properties. 
I am expecting to have the same values in errors coming from `state`.\r\n\r\n```tsx\r\n// https://codesandbox.io/s/my-app-wuloq\r\n// App.tsx\r\n\r\n // This has error properties when the form is invalid\r\n console.log(\"coming directly from useForm - errors:\", errors)\r\n // This seems it's always empty\r\n console.log(\"coming from state - errors:\", state.errors)\r\n\r\n```","html":"

Using useReducer will have errors as an empty object.

\n\n

Describe the bug\nerrors is empty if it's coming from useReducer

\n\n

To Reproduce\nhttps://codesandbox.io/s/my-app-wuloq

\n\n

Expected behavior\nerrors before useReducer seems to have correct properties. I am expecting to have the same values in errors coming from state.

\n\n

```tsx\n// https://codesandbox.io/s/my-app-wuloq\n// App.tsx

\n\n

// This has error properties when the form is invalid\n console.log(\"coming directly from useForm - errors:\", errors)\n // This seems it's always empty\n console.log(\"coming from state - errors:\", state.errors)

\n\n

```

\n","meta":{"source":"GitHub","url":"https://github.com/react-hook-form/react-hook-form/issues/204"},"_input_hash":943749970,"_task_hash":-1248600479,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Router upgrade does not work when initial state is Angular","meta":{"source":"GitHub","url":"https://github.com/angular/angular/issues/18329"},"label":"DOCUMENTATION","_input_hash":-2058074284,"_task_hash":1079221815,"answer":"reject"} {"text":"# Preparing Hass.io .... infinite time\n\nI've just tried to install hass.io on Hetzner Cloud (also tried to install on local vm)\r\n\r\nafter script finished with no error and [Info] Run Hass.io message, i can access web at http://ip.adrress:8123 but all i get is Preparing Hass.io (this can take up to 20 minutes) message.\r\nBoth VMs are fast and even after 4-5 hours there's nothing much happening.\r\n\r\nOS: ubuntu-18.04.3-live-server-amd64, fresh install. Docker-ce is installed as describes here: https://docs.docker.com/install/linux/docker-ce/debian/\r\n\r\n\r\nhave no clue what to do next.","title":"Preparing Hass.io .... infinite time","body":"I've just tried to install hass.io on Hetzner Cloud (also tried to install on local vm)\r\n\r\nafter script finished with no error and [Info] Run Hass.io message, i can access web at http://ip.adrress:8123 but all i get is Preparing Hass.io (this can take up to 20 minutes) message.\r\nBoth VMs are fast and even after 4-5 hours there's nothing much happening.\r\n\r\nOS: ubuntu-18.04.3-live-server-amd64, fresh install. Docker-ce is installed as describes here: https://docs.docker.com/install/linux/docker-ce/debian/\r\n\r\n\r\nhave no clue what to do next.","html":"

Preparing Hass.io .... infinite time

\n\n

I've just tried to install Hass.io on Hetzner Cloud (I also tried to install on a local VM).

\n\n

After the script finished with no errors and the [Info] Run Hass.io message, I can access the web interface at http://ip.address:8123, but all I get is the Preparing Hass.io (this can take up to 20 minutes) message.\nBoth VMs are fast, and even after 4-5 hours nothing much is happening.

\n\n

OS: ubuntu-18.04.3-live-server-amd64, fresh install. Docker-ce is installed as described here: https://docs.docker.com/install/linux/docker-ce/debian/

\n\n

I have no clue what to do next.

\n","meta":{"source":"GitHub","url":"https://github.com/home-assistant/hassio-installer/issues/31"},"_input_hash":49363284,"_task_hash":-445110118,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"PortalScripting typings are missing constructor for ScriptCall","meta":{"source":"GitHub","url":"https://github.com/otris/vscode-janus-debug/issues/98"},"label":"DOCUMENTATION","_input_hash":1045098939,"_task_hash":32357809,"answer":"reject"} {"text":"[Feature Request] more protocols for links","meta":{"source":"GitHub","url":"https://github.com/turnermm/ckgdoku/issues/23"},"label":"DOCUMENTATION","_input_hash":-75023385,"_task_hash":1712018104,"answer":"reject"} {"text":"\"499 Client Error: Client Disconnected for url: https://upload.pypi.org/legacy/\"","meta":{"source":"GitHub","url":"https://github.com/pypa/pypi-legacy/issues/675"},"label":"DOCUMENTATION","_input_hash":-1079207912,"_task_hash":-286082400,"answer":"reject"} {"text":"# Enquiry about RepositoryFiles section from the node-gitlab package\n\nHi!\r\nI use node-gitlab package (Gitlab API Node.js client) in order to get a file from a gitlab repository . I followed what exists in the official documentation concerning this one (https://www.npmjs.com/package/node-gitlab) and I tried this function (see the capture below) but I received this error during the execution. 
\r\nAny idea about how to resolve this?\r\nThanks in advance!\r\n\r\nHere is the code that I tried: [\r\n![Capture](https://user-images.githubusercontent.com/36052172/62827552-99d16100-bbc8-11e9-9e1a-0bde8e427b7e.PNG)\r\n](url)\r\n\r\nThe result that I supposed to get: \r\n![Capture1](https://user-images.githubusercontent.com/36052172/62827558-d4d39480-bbc8-11e9-9c13-35fcf0d963a4.PNG)\r\n\r\nThe output: \r\n![Capture2](https://user-images.githubusercontent.com/36052172/62827566-15331280-bbc9-11e9-8666-a7a2d5c0f8a9.PNG)\r\n\r\nThe function : \r\n![Capture3](https://user-images.githubusercontent.com/36052172/62827584-9db1b300-bbc9-11e9-8639-12de2e4a7211.PNG)\r\n\r\n\r\n\r\n","title":"Enquiry about RepositoryFiles section from the node-gitlab package","body":"Hi!\r\nI use node-gitlab package (Gitlab API Node.js client) in order to get a file from a gitlab repository . I followed what exists in the official documentation concerning this one (https://www.npmjs.com/package/node-gitlab) and I tried this function (see the capture below) but I received this error during the execution. \r\nAny idea about how to resolve this?\r\nThanks in advance!\r\n\r\nHere is the code that I tried: [\r\n![Capture](https://user-images.githubusercontent.com/36052172/62827552-99d16100-bbc8-11e9-9e1a-0bde8e427b7e.PNG)\r\n](url)\r\n\r\nThe result that I supposed to get: \r\n![Capture1](https://user-images.githubusercontent.com/36052172/62827558-d4d39480-bbc8-11e9-9c13-35fcf0d963a4.PNG)\r\n\r\nThe output: \r\n![Capture2](https://user-images.githubusercontent.com/36052172/62827566-15331280-bbc9-11e9-8666-a7a2d5c0f8a9.PNG)\r\n\r\nThe function : \r\n![Capture3](https://user-images.githubusercontent.com/36052172/62827584-9db1b300-bbc9-11e9-8639-12de2e4a7211.PNG)\r\n\r\n\r\n\r\n","html":"

Enquiry about RepositoryFiles section from the node-gitlab package

\n\n

Hi!\nI use the node-gitlab package (GitLab API Node.js client) to get a file from a GitLab repository. I followed the official documentation (https://www.npmjs.com/package/node-gitlab) and tried this function (see the capture below), but I received this error during execution.\nAny idea how to resolve this?\nThanks in advance!

\n\n

Here is the code that I tried: \n\"Capture\"\n

\n\n

The result that I expected to get: \n\"Capture1\"

\n\n

The output: \n\"Capture2\"

\n\n

The function : \n\"Capture3\"

\n","meta":{"source":"GitHub","url":"https://github.com/repo-utils/gitlab/issues/68"},"_input_hash":1496482536,"_task_hash":-1737428603,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# 404 when click to Edit on the Multidimensional Modeling (Adventure Works Tutorial)\n\n**\ud83d\uded1 IMPORTANT**: You can get your feedback addressed faster if you **use the comment section for the article in which you encountered a problem**.\r\n\r\n**Link to article:**\r\nhttps://docs.microsoft.com/en-us/analysis-services/multidimensional-tutorial/multidimensional-modeling-adventure-works-tutorial?view=sql-server-2017\r\n\r\n**Problem:**\r\nAfter signing in and click on the Edit button. It return 404 page not found. \r\n","title":"404 when click to Edit on the Multidimensional Modeling (Adventure Works Tutorial)","body":"**\ud83d\uded1 IMPORTANT**: You can get your feedback addressed faster if you **use the comment section for the article in which you encountered a problem**.\r\n\r\n**Link to article:**\r\nhttps://docs.microsoft.com/en-us/analysis-services/multidimensional-tutorial/multidimensional-modeling-adventure-works-tutorial?view=sql-server-2017\r\n\r\n**Problem:**\r\nAfter signing in and click on the Edit button. It return 404 page not found. \r\n","html":"

404 when click to Edit on the Multidimensional Modeling (Adventure Works Tutorial)

\n\n

\ud83d\uded1 IMPORTANT: You can get your feedback addressed faster if you use the comment section for the article in which you encountered a problem.

\n\n

Link to article:\nhttps://docs.microsoft.com/en-us/analysis-services/multidimensional-tutorial/multidimensional-modeling-adventure-works-tutorial?view=sql-server-2017

\n\n

Problem:\nAfter signing in and clicking the Edit button, it returns a 404 page not found.

\n","meta":{"source":"GitHub","url":"https://github.com/MicrosoftDocs/feedback/issues/1826"},"_input_hash":-1096452978,"_task_hash":1364217161,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# CVE-2018-11697 (High) detected in CSS::Sass-v3.6.0\n\n## CVE-2018-11697 - High Severity Vulnerability\n
Vulnerable Library - CSS::Sass-v3.6.0

\n

\n\n

Library home page: https://metacpan.org/pod/CSS::Sass

\n

Found in HEAD commit: eeefb98d520629c182c4d88691216d2bd738678a

\n

\n
\n

\n
Library Source Files (63)\n

\n

* The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.

\n

\n\n - /website/docs/node_modules/node-sass/src/libsass/src/color_maps.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_util.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/unchecked.h\n - /website/docs/node_modules/node-sass/src/libsass/src/output.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/b64/cencode.h\n - /website/docs/node_modules/node-sass/src/libsass/src/source_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/lexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8.h\n - /website/docs/node_modules/node-sass/src/libsass/test/test_node.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8_string.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/plugins.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/node.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/base.h\n - /website/docs/node_modules/node-sass/src/libsass/src/json.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/environment.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/position.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/subset_map.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/remove_placeholders.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_context.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.cpp\n - /website/docs/node_modules/node-sass/src/libsass/contrib/plugin.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/core.h\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/functions.h\n - /website/docs/node_modules/node-sass/src/libsass/test/test_superselector.cpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/sass_functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8_string.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/node.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cencode.c\n - /website/docs/node_modules/node-sass/src/libsass/src/subset_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/base64vlq.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/listize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/c99func.c\n - /website/docs/node_modules/node-sass/src/libsass/src/position.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/remove_placeholders.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/values.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/test/test_subset_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass2scss.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/memory/SharedPtr.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/paths.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/context.h\n - /website/docs/node_modules/node-sass/src/libsass/src/color_maps.hpp\n - /website/docs/node_modules/node-sass/src/libsass/test/test_unification.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_util.cpp\n - /website/docs/node_modules/node-sass/src/libsass/script/test-leaks.pl\n - /website/docs/node_modules/node-sass/src/libsass/src/source_map.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/lexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/memory/SharedPtr.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/json.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/units.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.hpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/units.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/b64/encode.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/environment.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/checked.h\n - /website/docs/node_modules/node-sass/src/libsass/src/plugins.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/listize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debug.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass2scss.h\n

\n
\n

\n

\n\n

\n
Vulnerability Details\n

\n \nAn issue was discovered in LibSass through 3.5.4. An out-of-bounds read of a memory region was found in the function Sass::Prelexer::exactly() which could be leveraged by an attacker to disclose information or manipulated to read from unmapped memory causing a denial of service.\n\n

Publish Date: 2018-06-04\n

URL: CVE-2018-11697

\n

\n
\n

\n
CVSS 3 Score Details (8.1)\n

\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: High\n - Integrity Impact: None\n - Availability Impact: High\n

\nFor more information on CVSS3 Scores, click here.\n

\n
\n

\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","title":"CVE-2018-11697 (High) detected in CSS::Sass-v3.6.0","body":"## CVE-2018-11697 - High Severity Vulnerability\n
Vulnerable Library - CSS::Sass-v3.6.0

\n

\n\n

Library home page: https://metacpan.org/pod/CSS::Sass

\n

Found in HEAD commit: eeefb98d520629c182c4d88691216d2bd738678a

\n

\n
\n

\n
Library Source Files (63)\n

\n

* The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.

\n

\n\n - /website/docs/node_modules/node-sass/src/libsass/src/color_maps.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_util.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/unchecked.h\n - /website/docs/node_modules/node-sass/src/libsass/src/output.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/b64/cencode.h\n - /website/docs/node_modules/node-sass/src/libsass/src/source_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/lexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8.h\n - /website/docs/node_modules/node-sass/src/libsass/test/test_node.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8_string.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/plugins.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/node.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/base.h\n - /website/docs/node_modules/node-sass/src/libsass/src/json.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/environment.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/position.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/subset_map.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/remove_placeholders.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_context.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.cpp\n - /website/docs/node_modules/node-sass/src/libsass/contrib/plugin.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/core.h\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/functions.h\n - /website/docs/node_modules/node-sass/src/libsass/test/test_superselector.cpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/sass_functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8_string.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/node.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cencode.c\n - /website/docs/node_modules/node-sass/src/libsass/src/subset_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/base64vlq.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/listize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/c99func.c\n - /website/docs/node_modules/node-sass/src/libsass/src/position.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/remove_placeholders.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/values.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/test/test_subset_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass2scss.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/memory/SharedPtr.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/paths.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/context.h\n - /website/docs/node_modules/node-sass/src/libsass/src/color_maps.hpp\n - /website/docs/node_modules/node-sass/src/libsass/test/test_unification.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_util.cpp\n - /website/docs/node_modules/node-sass/src/libsass/script/test-leaks.pl\n - /website/docs/node_modules/node-sass/src/libsass/src/source_map.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/lexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/memory/SharedPtr.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/json.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/units.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.hpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/units.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/b64/encode.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/environment.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/checked.h\n - /website/docs/node_modules/node-sass/src/libsass/src/plugins.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/listize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debug.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass2scss.h\n

\n
\n

\n

\n\n

\n
Vulnerability Details\n

\n \nAn issue was discovered in LibSass through 3.5.4. An out-of-bounds read of a memory region was found in the function Sass::Prelexer::exactly() which could be leveraged by an attacker to disclose information or manipulated to read from unmapped memory causing a denial of service.\n\n

Publish Date: 2018-06-04\n

URL: CVE-2018-11697

\n

\n
\n

\n
CVSS 3 Score Details (8.1)\n

\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: High\n - Integrity Impact: None\n - Availability Impact: High\n

\nFor more information on CVSS3 Scores, click here.\n

\n
\n

\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","html":"

CVE-2018-11697 (High) detected in CSS::Sass-v3.6.0

\n\n

CVE-2018-11697 - High Severity Vulnerability

\n\n

Vulnerable Library - CSS::Sassv3.6.0

\n\n

\n\n

Library home page: https://metacpan.org/pod/CSS::Sass

\n

Found in HEAD commit: eeefb98d520629c182c4d88691216d2bd738678a

\n

\n\n

\n

\n
Library Source Files (63)

\n\n

\n

* The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.

\n

\n\n - /website/docs/node_modules/node-sass/src/libsass/src/color_maps.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_util.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/unchecked.h\n - /website/docs/node_modules/node-sass/src/libsass/src/output.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/b64/cencode.h\n - /website/docs/node_modules/node-sass/src/libsass/src/source_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/lexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8.h\n - /website/docs/node_modules/node-sass/src/libsass/test/test_node.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8_string.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/plugins.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/node.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/base.h\n - /website/docs/node_modules/node-sass/src/libsass/src/json.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/environment.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/position.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/subset_map.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/remove_placeholders.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_context.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.cpp\n - /website/docs/node_modules/node-sass/src/libsass/contrib/plugin.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/core.h\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/functions.h\n - /website/docs/node_modules/node-sass/src/libsass/test/test_superselector.cpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/sass_functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8_string.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/node.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cencode.c\n - /website/docs/node_modules/node-sass/src/libsass/src/subset_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/base64vlq.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/listize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/c99func.c\n - /website/docs/node_modules/node-sass/src/libsass/src/position.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/remove_placeholders.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/values.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/test/test_subset_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass2scss.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/memory/SharedPtr.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/paths.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/context.h\n - /website/docs/node_modules/node-sass/src/libsass/src/color_maps.hpp\n - /website/docs/node_modules/node-sass/src/libsass/test/test_unification.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_util.cpp\n - /website/docs/node_modules/node-sass/src/libsass/script/test-leaks.pl\n - /website/docs/node_modules/node-sass/src/libsass/src/source_map.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/lexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/memory/SharedPtr.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/json.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/units.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.hpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/units.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/b64/encode.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/environment.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/checked.h\n - /website/docs/node_modules/node-sass/src/libsass/src/plugins.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/listize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debug.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass2scss.h\n

\n\n

\n\n

\n

\n\n

\n\n

\n
Vulnerability Details\n

\n\nAn issue was discovered in LibSass through 3.5.4. An out-of-bounds read of a memory region was found in the function Sass::Prelexer::exactly() which could be leveraged by an attacker to disclose information or manipulated to read from unmapped memory causing a denial of service.\n\n

Publish Date: 2018-06-04\n

URL: CVE-2018-11697

\n

\n\n

\n\n

\n
CVSS 3 Score Details (8.1)\n

\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: High\n - Integrity Impact: None\n - Availability Impact: High\n

\n\n

For more information on CVSS3 Scores, click here.\n

\n

\n\n

\n\n
\n\n

Step up your Open Source Security Game with WhiteSource here

\n","meta":{"source":"GitHub","url":"https://github.com/mixcore/website/issues/21"},"_input_hash":1112675827,"_task_hash":-975140188,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Need an issue template","meta":{"source":"GitHub","url":"https://github.com/GenisysProPlugins/PluginRequests/issues/1"},"label":"DOCUMENTATION","_input_hash":-2043512139,"_task_hash":-163952167,"answer":"reject"} {"text":"Compilation problem with dependency Vigra","meta":{"source":"GitHub","url":"https://github.com/deeplearningais/curfil/issues/18"},"label":"DOCUMENTATION","_input_hash":-1159219397,"_task_hash":-1167468875,"answer":"reject"} {"text":"# CVE-2018-19827 (High) detected in CSS::Sass-v3.6.0\n\n## CVE-2018-19827 - High Severity Vulnerability\n
Vulnerable Library - CSS::Sassv3.6.0

\n

\n\n

Library home page: https://metacpan.org/pod/CSS::Sass

\n

Found in HEAD commit: eeefb98d520629c182c4d88691216d2bd738678a

\n

\n
\n

\n
Library Source Files (63)\n

\n

* The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.

\n

\n\n - /website/docs/node_modules/node-sass/src/libsass/src/color_maps.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_util.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/unchecked.h\n - /website/docs/node_modules/node-sass/src/libsass/src/output.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/b64/cencode.h\n - /website/docs/node_modules/node-sass/src/libsass/src/source_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/lexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8.h\n - /website/docs/node_modules/node-sass/src/libsass/test/test_node.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8_string.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/plugins.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/node.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/base.h\n - /website/docs/node_modules/node-sass/src/libsass/src/json.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/environment.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/position.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/subset_map.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/remove_placeholders.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_context.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.cpp\n - /website/docs/node_modules/node-sass/src/libsass/contrib/plugin.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/core.h\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/functions.h\n - /website/docs/node_modules/node-sass/src/libsass/test/test_superselector.cpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/sass_functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8_string.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/node.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cencode.c\n - /website/docs/node_modules/node-sass/src/libsass/src/subset_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/base64vlq.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/listize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/c99func.c\n - /website/docs/node_modules/node-sass/src/libsass/src/position.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/remove_placeholders.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/values.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/test/test_subset_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass2scss.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/memory/SharedPtr.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/paths.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/context.h\n - /website/docs/node_modules/node-sass/src/libsass/src/color_maps.hpp\n - /website/docs/node_modules/node-sass/src/libsass/test/test_unification.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_util.cpp\n - /website/docs/node_modules/node-sass/src/libsass/script/test-leaks.pl\n - /website/docs/node_modules/node-sass/src/libsass/src/source_map.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/lexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/memory/SharedPtr.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/json.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/units.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.hpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/units.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/b64/encode.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/environment.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/checked.h\n - /website/docs/node_modules/node-sass/src/libsass/src/plugins.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/listize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debug.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass2scss.h\n

\n
\n

\n

\n\n

\n
Vulnerability Details\n

\n \nIn LibSass 3.5.5, a use-after-free vulnerability exists in the SharedPtr class in SharedPtr.cpp (or SharedPtr.hpp) that may cause a denial of service (application crash) or possibly have unspecified other impact.\n\n

Publish Date: 2018-12-03\n

URL: CVE-2018-19827

\n

\n
\n

\n
CVSS 3 Score Details (8.8)\n

\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: High\n - Integrity Impact: High\n - Availability Impact: High\n

\nFor more information on CVSS3 Scores, click here.\n

\n
\n

\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","title":"CVE-2018-19827 (High) detected in CSS::Sass-v3.6.0","body":"## CVE-2018-19827 - High Severity Vulnerability\n
Vulnerable Library - CSS::Sassv3.6.0

\n

\n\n

Library home page: https://metacpan.org/pod/CSS::Sass

\n

Found in HEAD commit: eeefb98d520629c182c4d88691216d2bd738678a

\n

\n
\n

\n
Library Source Files (63)\n

\n

* The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.

\n

\n\n - /website/docs/node_modules/node-sass/src/libsass/src/color_maps.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_util.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/unchecked.h\n - /website/docs/node_modules/node-sass/src/libsass/src/output.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/b64/cencode.h\n - /website/docs/node_modules/node-sass/src/libsass/src/source_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/lexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8.h\n - /website/docs/node_modules/node-sass/src/libsass/test/test_node.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8_string.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/plugins.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/node.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/base.h\n - /website/docs/node_modules/node-sass/src/libsass/src/json.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/environment.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/position.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/subset_map.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/remove_placeholders.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_context.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.cpp\n - /website/docs/node_modules/node-sass/src/libsass/contrib/plugin.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/core.h\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/functions.h\n - /website/docs/node_modules/node-sass/src/libsass/test/test_superselector.cpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/sass_functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8_string.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/node.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cencode.c\n - /website/docs/node_modules/node-sass/src/libsass/src/subset_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/base64vlq.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/listize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/c99func.c\n - /website/docs/node_modules/node-sass/src/libsass/src/position.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/remove_placeholders.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/values.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/test/test_subset_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass2scss.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/memory/SharedPtr.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/paths.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/context.h\n - /website/docs/node_modules/node-sass/src/libsass/src/color_maps.hpp\n - /website/docs/node_modules/node-sass/src/libsass/test/test_unification.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_util.cpp\n - /website/docs/node_modules/node-sass/src/libsass/script/test-leaks.pl\n - /website/docs/node_modules/node-sass/src/libsass/src/source_map.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/lexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/memory/SharedPtr.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/json.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/units.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.hpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/units.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/b64/encode.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/environment.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/checked.h\n - /website/docs/node_modules/node-sass/src/libsass/src/plugins.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/listize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debug.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass2scss.h\n

\n
\n

\n

\n\n

\n
Vulnerability Details\n

\n \nIn LibSass 3.5.5, a use-after-free vulnerability exists in the SharedPtr class in SharedPtr.cpp (or SharedPtr.hpp) that may cause a denial of service (application crash) or possibly have unspecified other impact.\n\n

Publish Date: 2018-12-03\n

URL: CVE-2018-19827

\n

\n
\n

\n
CVSS 3 Score Details (8.8)\n

\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: High\n - Integrity Impact: High\n - Availability Impact: High\n

\nFor more information on CVSS3 Scores, click here.\n

\n
\n

\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","html":"

CVE-2018-19827 (High) detected in CSS::Sass-v3.6.0

\n\n

CVE-2018-19827 - High Severity Vulnerability

\n\n

Vulnerable Library - CSS::Sassv3.6.0

\n\n

\n\n

Library home page: https://metacpan.org/pod/CSS::Sass

\n

Found in HEAD commit: eeefb98d520629c182c4d88691216d2bd738678a

\n

\n\n

\n

\n
Library Source Files (63)

\n\n

\n

* The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.

\n

\n\n - /website/docs/node_modules/node-sass/src/libsass/src/color_maps.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_util.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/unchecked.h\n - /website/docs/node_modules/node-sass/src/libsass/src/output.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/b64/cencode.h\n - /website/docs/node_modules/node-sass/src/libsass/src/source_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/lexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8.h\n - /website/docs/node_modules/node-sass/src/libsass/test/test_node.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8_string.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/plugins.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/node.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/base.h\n - /website/docs/node_modules/node-sass/src/libsass/src/json.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/environment.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/position.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/subset_map.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/remove_placeholders.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_context.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.cpp\n - /website/docs/node_modules/node-sass/src/libsass/contrib/plugin.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/core.h\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/functions.h\n - /website/docs/node_modules/node-sass/src/libsass/test/test_superselector.cpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/sass_functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8_string.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/node.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cencode.c\n - /website/docs/node_modules/node-sass/src/libsass/src/subset_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/base64vlq.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/listize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/c99func.c\n - /website/docs/node_modules/node-sass/src/libsass/src/position.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/remove_placeholders.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/values.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/test/test_subset_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass2scss.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/memory/SharedPtr.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/paths.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/context.h\n - /website/docs/node_modules/node-sass/src/libsass/src/color_maps.hpp\n - /website/docs/node_modules/node-sass/src/libsass/test/test_unification.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_util.cpp\n - /website/docs/node_modules/node-sass/src/libsass/script/test-leaks.pl\n - /website/docs/node_modules/node-sass/src/libsass/src/source_map.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/lexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/memory/SharedPtr.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/json.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/units.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.hpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/units.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/b64/encode.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/environment.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/checked.h\n - /website/docs/node_modules/node-sass/src/libsass/src/plugins.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/listize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debug.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass2scss.h\n

\n\n

\n\n

\n

\n\n

\n\n

\n
Vulnerability Details\n

\n\nIn LibSass 3.5.5, a use-after-free vulnerability exists in the SharedPtr class in SharedPtr.cpp (or SharedPtr.hpp) that may cause a denial of service (application crash) or possibly have unspecified other impact.\n\n

Publish Date: 2018-12-03\n

URL: CVE-2018-19827

\n

\n\n

\n\n

\n
CVSS 3 Score Details (8.8)\n

\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: High\n - Integrity Impact: High\n - Availability Impact: High\n

\n\n

For more information on CVSS3 Scores, click here.\n

\n

\n\n

\n\n
\n\n

Step up your Open Source Security Game with WhiteSource here

\n","meta":{"source":"GitHub","url":"https://github.com/mixcore/website/issues/23"},"_input_hash":-1845372771,"_task_hash":-1642688807,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Please add \"remove_tags\" support to CreateOrUpdateSubscriber","meta":{"source":"GitHub","url":"https://github.com/DripEmail/drip-dot-net/issues/12"},"label":"DOCUMENTATION","_input_hash":19240968,"_task_hash":-526522679,"answer":"reject"} {"text":"# Migrating examples from CodeSandbox to the examples folder\n\n\r\n\r\nThere's a note at the top of the [Examples page](https://final-form.org/docs/react-final-form/examples) that calls for help in moving examples from CodeSandbox to the [examples folder](https://github.com/final-form/react-final-form/tree/master/examples).\r\nI would like to help out with this but I'm not quite sure what is required for migration. I can see that the files in each example's folder match those in the respective CodeSandbox, but the README of each folder contains broken links to the respective sandboxes; \"Could not find package.json\" is the error message.\r\n\r\nIt appears that you want the files to be copied from the CodeSandbox to a corresponding subfolder in the examples folder, but I want to be sure before doing anything. I also don't know which CodeSandbox link should be provided in the README for each subfolder, given that those in the current folders are broken as I mentioned earlier. Could you please provide some clarification or a little guide explaining what you want to see in each PR? Thanks.","title":"Migrating examples from CodeSandbox to the examples folder","body":"\r\n\r\nThere's a note at the top of the [Examples page](https://final-form.org/docs/react-final-form/examples) that calls for help in moving examples from CodeSandbox to the [examples folder](https://github.com/final-form/react-final-form/tree/master/examples).\r\nI would like to help out with this but I'm not quite sure what is required for migration. 
I can see that the files in each example's folder match those in the respective CodeSandbox, but the README of each folder contains broken links to the respective sandboxes; \"Could not find package.json\" is the error message.\r\n\r\nIt appears that you want the files to be copied from the CodeSandbox to a corresponding subfolder in the examples folder, but I want to be sure before doing anything. I also don't know which CodeSandbox link should be provided in the README for each subfolder, given that those in the current folders are broken as I mentioned earlier. Could you please provide some clarification or a little guide explaining what you want to see in each PR? Thanks.","html":"

Migrating examples from CodeSandbox to the examples folder

\n\n\n\n

There's a note at the top of the Examples page that calls for help in moving examples from CodeSandbox to the examples folder.\nI would like to help out with this but I'm not quite sure what is required for migration. I can see that the files in each example's folder match those in the respective CodeSandbox, but the README of each folder contains broken links to the respective sandboxes; \"Could not find package.json\" is the error message.

\n\n

It appears that you want the files to be copied from the CodeSandbox to a corresponding subfolder in the examples folder, but I want to be sure before doing anything. I also don't know which CodeSandbox link should be provided in the README for each subfolder, given that those in the current folders are broken as I mentioned earlier. Could you please provide some clarification or a little guide explaining what you want to see in each PR? Thanks.

\n","meta":{"source":"GitHub","url":"https://github.com/final-form/react-final-form/issues/585"},"_input_hash":427346485,"_task_hash":1612426605,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# CVE-2018-16487 (High) detected in lodash-1.0.2.tgz\n\n## CVE-2018-16487 - High Severity Vulnerability\n
Vulnerable Library - lodash-1.0.2.tgz

\n\n

A utility library delivering consistency, customization, performance, and extras.

\n

Library home page: https://registry.npmjs.org/lodash/-/lodash-1.0.2.tgz

\n

Path to dependency file: /website/docs/package.json

\n

Path to vulnerable library: /tmp/git/website/docs/node_modules/lodash/package.json

\n

\n\nDependency Hierarchy:\n - gulp-3.9.1.tgz (Root Library)\n - vinyl-fs-0.3.14.tgz\n - glob-watcher-0.0.6.tgz\n - gaze-0.5.2.tgz\n - globule-0.1.0.tgz\n - :x: **lodash-1.0.2.tgz** (Vulnerable Library)\n

Found in HEAD commit: eeefb98d520629c182c4d88691216d2bd738678a

\n

\n
\n

\n
Vulnerability Details\n

\n \nA prototype pollution vulnerability was found in lodash <4.17.11 where the functions merge, mergeWith, and defaultsDeep can be tricked into adding or modifying properties of Object.prototype.\n\n

Publish Date: 2019-02-01\n

URL: CVE-2018-16487

\n

\n
\n

\n
CVSS 3 Score Details (9.8)\n

\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: None\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: High\n - Integrity Impact: High\n - Availability Impact: High\n

\nFor more information on CVSS3 Scores, click here.\n

\n
\n

\n
Suggested Fix\n

\n\n

Type: Upgrade version

\n

Origin: https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2018-16487

\n

Release Date: 2019-02-01

\n

Fix Resolution: 4.17.11

\n\n

\n
\n

\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","title":"CVE-2018-16487 (High) detected in lodash-1.0.2.tgz","body":"## CVE-2018-16487 - High Severity Vulnerability\n
Vulnerable Library - lodash-1.0.2.tgz

\n\n

A utility library delivering consistency, customization, performance, and extras.

\n

Library home page: https://registry.npmjs.org/lodash/-/lodash-1.0.2.tgz

\n

Path to dependency file: /website/docs/package.json

\n

Path to vulnerable library: /tmp/git/website/docs/node_modules/lodash/package.json

\n

\n\nDependency Hierarchy:\n - gulp-3.9.1.tgz (Root Library)\n - vinyl-fs-0.3.14.tgz\n - glob-watcher-0.0.6.tgz\n - gaze-0.5.2.tgz\n - globule-0.1.0.tgz\n - :x: **lodash-1.0.2.tgz** (Vulnerable Library)\n

Found in HEAD commit: eeefb98d520629c182c4d88691216d2bd738678a

\n

\n
\n

\n
Vulnerability Details\n

\n \nA prototype pollution vulnerability was found in lodash <4.17.11 where the functions merge, mergeWith, and defaultsDeep can be tricked into adding or modifying properties of Object.prototype.\n\n

Publish Date: 2019-02-01\n

URL: CVE-2018-16487

\n

\n
\n

\n
CVSS 3 Score Details (9.8)\n

\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: None\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: High\n - Integrity Impact: High\n - Availability Impact: High\n

\nFor more information on CVSS3 Scores, click here.\n

\n
\n

\n
Suggested Fix\n

\n\n

Type: Upgrade version

\n

Origin: https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2018-16487

\n

Release Date: 2019-02-01

\n

Fix Resolution: 4.17.11

\n\n

\n
\n

\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","html":"

CVE-2018-16487 (High) detected in lodash-1.0.2.tgz

\n\n

CVE-2018-16487 - High Severity Vulnerability

\n\n

Vulnerable Library - lodash-1.0.2.tgz

\n\n

A utility library delivering consistency, customization, performance, and extras.

\n

Library home page: https://registry.npmjs.org/lodash/-/lodash-1.0.2.tgz

\n

Path to dependency file: /website/docs/package.json

\n

Path to vulnerable library: /tmp/git/website/docs/node_modules/lodash/package.json

\n

\n\nDependency Hierarchy:\n - gulp-3.9.1.tgz (Root Library)\n - vinyl-fs-0.3.14.tgz\n - glob-watcher-0.0.6.tgz\n - gaze-0.5.2.tgz\n - globule-0.1.0.tgz\n - :x: **lodash-1.0.2.tgz** (Vulnerable Library)\n

Found in HEAD commit: eeefb98d520629c182c4d88691216d2bd738678a

\n

\n\n

\n\n

\n
Vulnerability Details\n

\n\nA prototype pollution vulnerability was found in lodash <4.17.11 where the functions merge, mergeWith, and defaultsDeep can be tricked into adding or modifying properties of Object.prototype.\n\n

Publish Date: 2019-02-01\n

URL: CVE-2018-16487

\n

\n\n

\n\n

\n
CVSS 3 Score Details (9.8)\n

\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: None\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: High\n - Integrity Impact: High\n - Availability Impact: High\n

\n\n

For more information on CVSS3 Scores, click here.\n

\n

\n\n

\n
Suggested Fix\n

\n\n

Type: Upgrade version

\n

Origin: https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2018-16487

\n

Release Date: 2019-02-01

\n

Fix Resolution: 4.17.11

\n\n

\n\n

\n\n

\n\n
\n\n

Step up your Open Source Security Game with WhiteSource here

\n","meta":{"source":"GitHub","url":"https://github.com/mixcore/website/issues/2"},"_input_hash":-1802498710,"_task_hash":-278399929,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Cannot find libzmq in linux","meta":{"source":"GitHub","url":"https://github.com/bastibe/transplant/issues/44"},"label":"DOCUMENTATION","_input_hash":1191765733,"_task_hash":-76652099,"answer":"reject"} {"text":"Maven Dependency","meta":{"source":"GitHub","url":"https://github.com/Azure/mmlspark/issues/85"},"label":"DOCUMENTATION","_input_hash":-1881636457,"_task_hash":-186896603,"answer":"reject"} {"text":"Make it easier to use grafanalib in a project","meta":{"source":"GitHub","url":"https://github.com/weaveworks/grafanalib/issues/57"},"label":"DOCUMENTATION","_input_hash":1491208061,"_task_hash":-1166719389,"answer":"reject"} {"text":"Documentation of HttpServer.bind and HttpServer.bindSecure is misleading","meta":{"source":"GitHub","url":"https://github.com/dart-lang/sdk/issues/30277"},"label":"DOCUMENTATION","_input_hash":1666168821,"_task_hash":-173948191,"answer":"accept"} {"text":"Double precision emulation using 2 floats (double-float)","meta":{"source":"GitHub","url":"https://github.com/arrayfire/arrayfire/issues/1886"},"label":"DOCUMENTATION","_input_hash":1593641572,"_task_hash":-1673067609,"answer":"reject"} {"text":"Unable to run rook with dataDirHostPath","meta":{"source":"GitHub","url":"https://github.com/rook/rook/issues/839"},"label":"DOCUMENTATION","_input_hash":1903312202,"_task_hash":1389009989,"answer":"reject"} {"text":"Can't understand what is this page saying....","meta":{"source":"GitHub","url":"https://github.com/circleci/circleci-docs/issues/1347"},"label":"DOCUMENTATION","_input_hash":2080765975,"_task_hash":1159120610,"answer":"accept"} {"text":"[Feature Request] LaTeX with Chinese 
support","meta":{"source":"GitHub","url":"https://github.com/hyrious/Telegram.Bot/issues/2"},"label":"DOCUMENTATION","_input_hash":1316367338,"_task_hash":178764195,"answer":"reject"} {"text":"Add an FAQ page to the website","meta":{"source":"GitHub","url":"https://github.com/lagom/lagom.github.io/issues/90"},"label":"DOCUMENTATION","_input_hash":396024996,"_task_hash":-638364489,"answer":"accept"} {"text":"The connector can silently fail to connect to the requested database or schema","meta":{"source":"GitHub","url":"https://github.com/snowflakedb/snowflake-connector-python/issues/26"},"label":"DOCUMENTATION","_input_hash":54867438,"_task_hash":473821788,"answer":"reject"} {"text":"# Congratulations!\n\n## Nice work\n\n![celebrate](https://octodex.github.com/images/benevocats.jpg)\n\nCongratulations @anesta95, you've completed this course!\n\nWhen considering the security of your repository, consider the installed applications, like me. Every app installed on your repository has access to some of your data. Even if it is harmless (like me), it is a good idea to periodically check and prune the list of installed apps and integrations on your repositories. Look for things like active use, or permissions giving more access than necessary.\n\n### Manage app permissions\n\nAs much as it pains me to leave you, I want you to uninstall me from this repository. I won't be able to congratulate you on achieving this task, but know I'm excited about your progress.\n\nFollow the guidelines in [GitHub's documentation](https://help.github.com/articles/reviewing-your-authorized-integrations/#reviewing-your-authorized-github-apps) to review authorized OAuth and GitHub Apps. 
If you'd like to practice, you can uninstall Learning Lab from this repository.\n\n### What went well\n\nBefore I say good-bye, here's a recap of all the tasks you've accomplished in your repository:\n\n- Enable vulnerable dependency detection for private repositories\n- Detect and fix outdated dependencies with security vulnerabilities\n- Keep sensitive data out of your repository by leveraging the use of a `.gitignore` file\n\n### What's next?\n\nWant to learn more options to secure your repository? Check out the [documentation for security alerts](https://help.github.com/articles/about-security-alerts-for-vulnerable-dependencies/), as well as some [GitHub apps for security](https://github.com/marketplace/category/security) that might help you keep your code safe.\n\n### Keep Learning\n\nWant to keep learning? Feel free to [check out our other courses](https://lab.github.com/courses).\n\n
\n

I won't respond to this issue, go ahead and close it when finished.

\n","title":"Congratulations!","body":"## Nice work\n\n![celebrate](https://octodex.github.com/images/benevocats.jpg)\n\nCongratulations @anesta95, you've completed this course!\n\nWhen considering the security of your repository, consider the installed applications, like me. Every app installed on your repository has access to some of your data. Even if it is harmless (like me), it is a good idea to periodically check and prune the list of installed apps and integrations on your repositories. Look for things like active use, or permissions giving more access than necessary.\n\n### Manage app permissions\n\nAs much as it pains me to leave you, I want you to uninstall me from this repository. I won't be able to congratulate you on achieving this task, but know I'm excited about your progress.\n\nFollow the guidelines in [GitHub's documentation](https://help.github.com/articles/reviewing-your-authorized-integrations/#reviewing-your-authorized-github-apps) to review authorized OAuth and GitHub Apps. If you'd like to practice, you can uninstall Learning Lab from this repository.\n\n### What went well\n\nBefore I say good-bye, here's a recap of all the tasks you've accomplished in your repository:\n\n- Enable vulnerable dependency detection for private repositories\n- Detect and fix outdated dependencies with security vulnerabilities\n- Keep sensitive data out of your repository by leveraging the use of a `.gitignore` file\n\n### What's next?\n\nWant to learn more options to secure your repository? Check out the [documentation for security alerts](https://help.github.com/articles/about-security-alerts-for-vulnerable-dependencies/), as well as some [GitHub apps for security](https://github.com/marketplace/category/security) that might help you keep your code safe.\n\n### Keep Learning\n\nWant to keep learning? Feel free to [check out our other courses](https://lab.github.com/courses).\n\n
\n

I won't respond to this issue, go ahead and close it when finished.

\n","html":"

Congratulations!

\n\n

Nice work

\n\n

\"celebrate\"

\n\n

Congratulations @anesta95, you've completed this course!

\n\n

When considering the security of your repository, consider the installed applications, like me. Every app installed on your repository has access to some of your data. Even if it is harmless (like me), it is a good idea to periodically check and prune the list of installed apps and integrations on your repositories. Look for things like active use, or permissions giving more access than necessary.

\n\n

Manage app permissions

\n\n

As much as it pains me to leave you, I want you to uninstall me from this repository. I won't be able to congratulate you on achieving this task, but know I'm excited about your progress.

\n\n

Follow the guidelines in GitHub's documentation to review authorized OAuth and GitHub Apps. If you'd like to practice, you can uninstall Learning Lab from this repository.

\n\n

What went well

\n\n

Before I say good-bye, here's a recap of all the tasks you've accomplished in your repository:

\n\n
    \n
  • Enable vulnerable dependency detection for private repositories
  • \n
  • Detect and fix outdated dependencies with security vulnerabilities
  • \n
  • Keep sensitive data out of your repository by leveraging the use of a .gitignore file
  • \n
\n\n

What's next?

\n\n

Want to learn more options to secure your repository? Check out the documentation for security alerts, as well as some GitHub apps for security that might help you keep your code safe.

\n\n

Keep Learning

\n\n

Want to keep learning? Feel free to check out our other courses.

\n\n
\n\n

I won't respond to this issue, go ahead and close it when finished.

\n","meta":{"source":"GitHub","url":"https://github.com/anesta95/security-on-github/issues/5"},"_input_hash":-1733618728,"_task_hash":-1852088553,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# CVE-2019-6286 (Medium) detected in opennms-opennms-source-23.0.0-1\n\n## CVE-2019-6286 - Medium Severity Vulnerability\n
Vulnerable Library - opennmsopennms-source-23.0.0-1

\n

\n\n

A Java based fault and performance management system

\n

Library home page: https://sourceforge.net/projects/opennms/

\n

Found in HEAD commit: eeefb98d520629c182c4d88691216d2bd738678a

\n

\n
\n

\n
Library Source Files (62)\n

\n

* The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.

\n

\n\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/factory.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/value.h\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.hpp\n - /website/docs/node_modules/node-sass/src/callback_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operation.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operators.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.hpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/parser.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.cpp\n - /website/docs/node_modules/node-sass/src/custom_function_bridge.cpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/bind.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/backtrace.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.h\n - /website/docs/node_modules/node-sass/src/sass_types/sass_value_wrapper.h\n 
- /website/docs/node_modules/node-sass/src/libsass/src/error_handling.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debugger.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/number.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/output.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/null.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/inspect.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/values.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.h\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/string.cpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/sass_context.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.h\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.cpp\n

\n
\n

\n

\n\n

\n
Vulnerability Details\n

\n \nIn LibSass 3.5.5, a heap-based buffer over-read exists in Sass::Prelexer::skip_over_scopes in prelexer.hpp when called from Sass::Parser::parse_import(), a similar issue to CVE-2018-11693.\n\n

Publish Date: 2019-01-14\n

URL: CVE-2019-6286

\n

\n
\n

\n
CVSS 3 Score Details (6.5)\n

\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: None\n - Integrity Impact: None\n - Availability Impact: High\n

\nFor more information on CVSS3 Scores, click here.\n

\n
\n

\n
Suggested Fix\n

\n\n

Type: Upgrade version

\n

Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6286

\n

Release Date: 2019-08-06

\n

Fix Resolution: 3.6.0

\n\n

\n
\n

\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","title":"CVE-2019-6286 (Medium) detected in opennms-opennms-source-23.0.0-1","body":"## CVE-2019-6286 - Medium Severity Vulnerability\n
Vulnerable Library - opennmsopennms-source-23.0.0-1

\n

\n\n

A Java based fault and performance management system

\n

Library home page: https://sourceforge.net/projects/opennms/

\n

Found in HEAD commit: eeefb98d520629c182c4d88691216d2bd738678a

\n

\n
\n

\n
Library Source Files (62)\n

\n

* The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.

\n

\n\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/factory.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/value.h\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.hpp\n - /website/docs/node_modules/node-sass/src/callback_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operation.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operators.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.hpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/parser.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.cpp\n - /website/docs/node_modules/node-sass/src/custom_function_bridge.cpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/bind.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/backtrace.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.h\n - /website/docs/node_modules/node-sass/src/sass_types/sass_value_wrapper.h\n 
- /website/docs/node_modules/node-sass/src/libsass/src/error_handling.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debugger.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/number.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/output.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/null.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/inspect.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/values.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.h\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/string.cpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/sass_context.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.h\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.cpp\n

\n
\n

\n

\n\n

\n
Vulnerability Details\n

\n \nIn LibSass 3.5.5, a heap-based buffer over-read exists in Sass::Prelexer::skip_over_scopes in prelexer.hpp when called from Sass::Parser::parse_import(), a similar issue to CVE-2018-11693.\n\n

Publish Date: 2019-01-14\n

URL: CVE-2019-6286

\n

\n
\n

\n
CVSS 3 Score Details (6.5)\n

\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: None\n - Integrity Impact: None\n - Availability Impact: High\n

\nFor more information on CVSS3 Scores, click here.\n

\n
\n

\n
Suggested Fix\n

\n\n

Type: Upgrade version

\n

Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6286

\n

Release Date: 2019-08-06

\n

Fix Resolution: 3.6.0

\n\n

\n
\n

\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","html":"

CVE-2019-6286 (Medium) detected in opennms-opennms-source-23.0.0-1

\n\n

CVE-2019-6286 - Medium Severity Vulnerability

\n\n

Vulnerable Library - opennmsopennms-source-23.0.0-1

\n\n

\n\n

A Java based fault and performance management system

\n

Library home page: https://sourceforge.net/projects/opennms/

\n

Found in HEAD commit: eeefb98d520629c182c4d88691216d2bd738678a

\n

\n\n

\n

\n
Library Source Files (62)

\n\n

\n

* The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.

\n

\n\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/factory.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/value.h\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.hpp\n - /website/docs/node_modules/node-sass/src/callback_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operation.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operators.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.hpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/parser.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.cpp\n - /website/docs/node_modules/node-sass/src/custom_function_bridge.cpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/bind.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/backtrace.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.h\n - /website/docs/node_modules/node-sass/src/sass_types/sass_value_wrapper.h\n 
- /website/docs/node_modules/node-sass/src/libsass/src/error_handling.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debugger.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/number.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/output.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/null.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/inspect.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/values.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.h\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/string.cpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/sass_context.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.h\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.cpp\n

\n\n

\n\n

\n

\n\n

\n\n

\n
Vulnerability Details\n

\n\nIn LibSass 3.5.5, a heap-based buffer over-read exists in Sass::Prelexer::skip_over_scopes in prelexer.hpp when called from Sass::Parser::parse_import(), a similar issue to CVE-2018-11693.\n\n

Publish Date: 2019-01-14\n

URL: CVE-2019-6286

\n

\n\n

\n\n

\n
CVSS 3 Score Details (6.5)\n

\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: None\n - Integrity Impact: None\n - Availability Impact: High\n

\n\n

For more information on CVSS3 Scores, click here.\n

\n

\n\n

\n
Suggested Fix\n

\n\n

Type: Upgrade version

\n

Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6286

\n

Release Date: 2019-08-06

\n

Fix Resolution: 3.6.0

\n\n

\n\n

\n\n

\n\n
\n\n

Step up your Open Source Security Game with WhiteSource here

\n","meta":{"source":"GitHub","url":"https://github.com/mixcore/website/issues/19"},"_input_hash":-1026448287,"_task_hash":1703842994,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"[todo-mvvm-databinding] Task class is not actually immutable (despite docs claim)","meta":{"source":"GitHub","url":"https://github.com/googlesamples/android-architecture/issues/373"},"label":"DOCUMENTATION","_input_hash":-1157050643,"_task_hash":1898336563,"answer":"accept"} {"text":"Failing to compile with hw and xen ","meta":{"source":"GitHub","url":"https://github.com/rumpkernel/rumprun/issues/102"},"label":"DOCUMENTATION","_input_hash":1791394558,"_task_hash":-1039951089,"answer":"reject"} {"text":"# No readme file\n\ncan you add readme.md file with the detail description about the project? that would help ","title":"No readme file","body":"can you add readme.md file with the detail description about the project? that would help ","html":"

No readme file

\n\n

can you add readme.md file with the detail description about the project? that would help

\n","meta":{"source":"GitHub","url":"https://github.com/saka10101/TOURISM-ANALYSIS-AND-PREDICTIONS-IN-NEPAL/issues/9"},"_input_hash":634140719,"_task_hash":1384476301,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"# Error in auth0-spa-js\n\nAn error is happening when logged in:\r\n`ERROR TypeError: Cannot read property 'close' of undefined`\r\n\r\nThis issue is discussed here:\r\nhttps://community.auth0.com/t/typeerror-cannot-read-property-close-of-undefined-when-using-new-auth0-spa-library-in-angular-interceptor/28010\r\n\r\nThere was an update to the Auth0 documentation when setting up the auth service: \r\nhttps://auth0.com/docs/quickstart/spa/angular2","title":"Error in auth0-spa-js","body":"An error is happening when logged in:\r\n`ERROR TypeError: Cannot read property 'close' of undefined`\r\n\r\nThis issue is discussed here:\r\nhttps://community.auth0.com/t/typeerror-cannot-read-property-close-of-undefined-when-using-new-auth0-spa-library-in-angular-interceptor/28010\r\n\r\nThere was an update to the Auth0 documentation when setting up the auth service: \r\nhttps://auth0.com/docs/quickstart/spa/angular2","html":"

Error in auth0-spa-js

\n\n

An error is happening when logged in:\nERROR TypeError: Cannot read property 'close' of undefined

\n\n

This issue is discussed here:\nhttps://community.auth0.com/t/typeerror-cannot-read-property-close-of-undefined-when-using-new-auth0-spa-library-in-angular-interceptor/28010

\n\n

There was an update to the Auth0 documentation when setting up the auth service: \nhttps://auth0.com/docs/quickstart/spa/angular2

\n","meta":{"source":"GitHub","url":"https://github.com/dknight10/skystats/issues/3"},"_input_hash":-1428886199,"_task_hash":132167115,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# edit2_ERimage for README.md\n\n![](https://i.gyazo.com/a11f7c52c200a66a62321f5cab168f78.png)\r\n","title":"edit2_ERimage for README.md","body":"![](https://i.gyazo.com/a11f7c52c200a66a62321f5cab168f78.png)\r\n","html":"

edit2_ERimage for README.md

\n\n

\"\"

\n","meta":{"source":"GitHub","url":"https://github.com/jinugasachio/mercari_team-e/issues/22"},"_input_hash":-1222508491,"_task_hash":23506506,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"FieldDoesNotExist on Customising generated forms","meta":{"source":"GitHub","url":"https://github.com/wagtail/wagtail/issues/3737"},"label":"DOCUMENTATION","_input_hash":-2138863010,"_task_hash":-598961847,"answer":"reject"} {"text":"# Visual Guide in GUI for Different StreamDeck Models\n\n**Describe the feature**\r\nA simple rectangle around the buttons used by each of the smaller StreamDeck models when programming in Companion v2.0 or higher, as a reminder of button limitations when programming for StreamDeck Mini or the original StreamDeck\r\n\r\n**Is this platform dependent (windows, mac, ..)?**\r\nNo\r\n\r\n**If documentation is required to implement, do you know where to find it?**\r\nI am unaware of any documentation for this.\r\n\r\n**Usecases**\r\nI believe that this would be helpful for those new to programming in Companion, as they would be able to see the constraints of the device that they are programming for.\r\n\r\nIt would also help those with multiple StreamDeck versions, to serve as a reminder of the constraints of each version. Especially since in 2.0 and higher, we the ability to use the far left column as something other than page switching buttons, having that visual guide would be a great help.\r\n\r\nThe image is just a mock-up of what this could look like, with the constraints of the StreamDeck Mini shown on the regular size StreamDeck. 
I know that if the guide were in place with Companion 1.4 and below, the actual buttons in use for a StreamDeck mini would be only the top left 4, but this is just meant to serve as an example of the type of visual guide to which I'm referring.\r\n![Screen Shot 2019-08-11 at 11 34 23](https://user-images.githubusercontent.com/40829374/62828635-aab4bd80-bc2e-11e9-8109-a0a2c5b30009.png)\r\n\r\n","title":"Visual Guide in GUI for Different StreamDeck Models","body":"**Describe the feature**\r\nA simple rectangle around the buttons used by each of the smaller StreamDeck models when programming in Companion v2.0 or higher, as a reminder of button limitations when programming for StreamDeck Mini or the original StreamDeck\r\n\r\n**Is this platform dependent (windows, mac, ..)?**\r\nNo\r\n\r\n**If documentation is required to implement, do you know where to find it?**\r\nI am unaware of any documentation for this.\r\n\r\n**Usecases**\r\nI believe that this would be helpful for those new to programming in Companion, as they would be able to see the constraints of the device that they are programming for.\r\n\r\nIt would also help those with multiple StreamDeck versions, to serve as a reminder of the constraints of each version. Especially since in 2.0 and higher, we the ability to use the far left column as something other than page switching buttons, having that visual guide would be a great help.\r\n\r\nThe image is just a mock-up of what this could look like, with the constraints of the StreamDeck Mini shown on the regular size StreamDeck. I know that if the guide were in place with Companion 1.4 and below, the actual buttons in use for a StreamDeck mini would be only the top left 4, but this is just meant to serve as an example of the type of visual guide to which I'm referring.\r\n![Screen Shot 2019-08-11 at 11 34 23](https://user-images.githubusercontent.com/40829374/62828635-aab4bd80-bc2e-11e9-8109-a0a2c5b30009.png)\r\n\r\n","html":"

Visual Guide in GUI for Different StreamDeck Models

\n\n

Describe the feature\nA simple rectangle around the buttons used by each of the smaller StreamDeck models when programming in Companion v2.0 or higher, as a reminder of button limitations when programming for StreamDeck Mini or the original StreamDeck

\n\n

Is this platform dependent (windows, mac, ..)?\nNo

\n\n

If documentation is required to implement, do you know where to find it?\nI am unaware of any documentation for this.

\n\n

Usecases\nI believe that this would be helpful for those new to programming in Companion, as they would be able to see the constraints of the device that they are programming for.

\n\n

It would also help those with multiple StreamDeck versions, to serve as a reminder of the constraints of each version. Especially since in 2.0 and higher, we have the ability to use the far left column as something other than page switching buttons, having that visual guide would be a great help.

\n\n

The image is just a mock-up of what this could look like, with the constraints of the StreamDeck Mini shown on the regular size StreamDeck. I know that if the guide were in place with Companion 1.4 and below, the actual buttons in use for a StreamDeck mini would be only the top left 4, but this is just meant to serve as an example of the type of visual guide to which I'm referring.\n\"Screen

\n","meta":{"source":"GitHub","url":"https://github.com/bitfocus/companion/issues/782"},"_input_hash":549135609,"_task_hash":599815177,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Translate homepage texts and categories","meta":{"source":"GitHub","url":"https://github.com/arduino/reference-jp/issues/13"},"label":"DOCUMENTATION","_input_hash":-281845711,"_task_hash":-1901356329,"answer":"accept"} {"text":"Is this tool supported?","meta":{"source":"GitHub","url":"https://github.com/corbel-platform/corbel/issues/19"},"label":"DOCUMENTATION","_input_hash":776144129,"_task_hash":21955354,"answer":"reject"} {"text":"Failed Pre-Flight Check","meta":{"source":"GitHub","url":"https://github.com/snipe/snipe-it/issues/3787"},"label":"DOCUMENTATION","_input_hash":1004723498,"_task_hash":-399479667,"answer":"reject"} {"text":"How to add prefix to id?","meta":{"source":"GitHub","url":"https://github.com/valeriangalliat/markdown-it-anchor/issues/29"},"label":"DOCUMENTATION","_input_hash":-1187776007,"_task_hash":-1064955666,"answer":"reject"} {"text":"Docs","meta":{"source":"GitHub","url":"https://github.com/Girbons/rls-sdk/issues/2"},"label":"DOCUMENTATION","_input_hash":1154558741,"_task_hash":-1930213697,"answer":"accept"} {"text":"Support images in the List block","meta":{"source":"GitHub","url":"https://github.com/WordPress/gutenberg/issues/2042"},"label":"DOCUMENTATION","_input_hash":1606599865,"_task_hash":89812603,"answer":"reject"} {"text":"Conan docs incorrect.","meta":{"source":"GitHub","url":"https://github.com/conan-io/conan/issues/1547"},"label":"DOCUMENTATION","_input_hash":-1987011667,"_task_hash":-1521279003,"answer":"accept"} {"text":"tf.estimate quickstart","meta":{"source":"GitHub","url":"https://github.com/tensorflow/tensorflow/issues/11783"},"label":"DOCUMENTATION","_input_hash":-900889441,"_task_hash":1692413360,"answer":"reject"} {"text":"Amp-Analytics - feature ability to trigger multiple GA Event tags on page 
load","meta":{"source":"GitHub","url":"https://github.com/ampproject/amphtml/issues/10667"},"label":"DOCUMENTATION","_input_hash":-1963996218,"_task_hash":-1802797919,"answer":"reject"} {"text":"Replace tutorials in the user guide with links to the guides","meta":{"source":"GitHub","url":"https://github.com/gradle/guides/issues/156"},"label":"DOCUMENTATION","_input_hash":1942366496,"_task_hash":-24024176,"answer":"accept"} {"text":"# More documentation updates\n\n- [ ] Special css classes\r\n- [ ] Filenames should be unique","title":"More documentation updates","body":"- [ ] Special css classes\r\n- [ ] Filenames should be unique","html":"

More documentation updates

\n\n
    \n
  • [ ] Special css classes
  • \n
  • [ ] Filenames should be unique
  • \n
\n","meta":{"source":"GitHub","url":"https://github.com/magnusdahlgren/magnetizer/issues/60"},"_input_hash":287895235,"_task_hash":1643790979,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"[feat]: IssuesEvent","meta":{"source":"GitHub","url":"https://github.com/bleenco/abstruse/issues/24"},"label":"DOCUMENTATION","_input_hash":-745258174,"_task_hash":1841083377,"answer":"reject"} {"text":"# iOS deploy not working\n\nThis is the error I get after following all steps out of the documentation. Using Unity3D v.2019.3.0a11 and the latest version of flutter-unity-view-widget and the latest version of Flutter:\r\n\r\n2019-08-11 00:35:01.400320+0200 Runner[1571:278941] Built from 'trunk' branch, Version '2019.3.0a11 (6fa9444d8a5d)', Build type 'Release', Scripting Backend 'il2cpp'\r\nno boot config - using default values\r\n \r\n\r\nCould you provide a fix or help.\r\n\r\nThank you.","title":"iOS deploy not working","body":"This is the error I get after following all steps out of the documentation. Using Unity3D v.2019.3.0a11 and the latest version of flutter-unity-view-widget and the latest version of Flutter:\r\n\r\n2019-08-11 00:35:01.400320+0200 Runner[1571:278941] Built from 'trunk' branch, Version '2019.3.0a11 (6fa9444d8a5d)', Build type 'Release', Scripting Backend 'il2cpp'\r\nno boot config - using default values\r\n \r\n\r\nCould you provide a fix or help.\r\n\r\nThank you.","html":"

iOS deploy not working

\n\n

This is the error I get after following all steps out of the documentation. Using Unity3D v.2019.3.0a11 and the latest version of flutter-unity-view-widget and the latest version of Flutter:

\n\n

2019-08-11 00:35:01.400320+0200 Runner[1571:278941] Built from 'trunk' branch, Version '2019.3.0a11 (6fa9444d8a5d)', Build type 'Release', Scripting Backend 'il2cpp'\nno boot config - using default values

\n\n

Could you provide a fix or help?

\n\n

Thank you.

\n","meta":{"source":"GitHub","url":"https://github.com/snowballdigital/flutter-unity-view-widget/issues/30"},"_input_hash":-1706382550,"_task_hash":77697530,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"undefined method `head?' for nil:NilClass when Golang has update","meta":{"source":"GitHub","url":"https://github.com/Homebrew/brew/issues/2944"},"label":"DOCUMENTATION","_input_hash":749943835,"_task_hash":193268864,"answer":"reject"} {"text":"# Add new logo to readme\n\n","title":"Add new logo to readme","body":"","html":"

Add new logo to readme

\n","meta":{"source":"GitHub","url":"https://github.com/ChilliCream/hotchocolate/issues/1000"},"_input_hash":-1463427254,"_task_hash":-1617213300,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"HTML documentation on UntrustedRootOK looks strange","meta":{"source":"GitHub","url":"https://github.com/ForNeVeR/Jabber-Net/issues/89"},"label":"DOCUMENTATION","_input_hash":197195064,"_task_hash":-656707263,"answer":"accept"} {"text":"Outdated README.md","meta":{"source":"GitHub","url":"https://github.com/microsoftgraph/android-java-snippets-rest-sample/issues/31"},"label":"DOCUMENTATION","_input_hash":1733125356,"_task_hash":141856487,"answer":"accept"} {"text":"broken link in ABOUT","meta":{"source":"GitHub","url":"https://github.com/autokey-py3/autokey/issues/94"},"label":"DOCUMENTATION","_input_hash":889602624,"_task_hash":-948366756,"answer":"accept"} {"text":"[feature] Tizen support","meta":{"source":"GitHub","url":"https://github.com/Placeholder-Software/Dissonance/issues/42"},"label":"DOCUMENTATION","_input_hash":121737917,"_task_hash":-1234977562,"answer":"reject"} {"text":"README text / jQuery removal?","meta":{"source":"GitHub","url":"https://github.com/nathanvda/cocoon/issues/441"},"label":"DOCUMENTATION","_input_hash":2077850099,"_task_hash":1097175349,"answer":"accept"} {"text":"# SDCard not showing up\n\nIm trying to install windows 10 Pro Build 18956, but in GUI my sdcard not showing up, and in cmd line `\"C:\\Users\\Administrador\\Downloads\\WOA.Deployer\\WoaDeployer.exe\" deploy --disk 2 --wim install.wim` give me some erros \r\n\r\n**Screenshots**\r\nMy SDCARD showing normaly on diskmgmt.msc\r\n![image](https://user-images.githubusercontent.com/3611208/62829468-95ab4080-bbd4-11e9-843f-0acff64d775d.png)\r\nand on explorer\r\n![image](https://user-images.githubusercontent.com/3611208/62829437-cb9bf500-bbd3-11e9-86c8-bb306f4fb0df.png)\r\nUsing GUI 
utility\r\n![image](https://user-images.githubusercontent.com/3611208/62829408-3e58a080-bbd3-11e9-8757-4212c7aa4d9c.png)\r\nUsing cmd line utility\r\n![image](https://user-images.githubusercontent.com/3611208/62829392-cdb18400-bbd2-11e9-95fc-3f53caaab13e.png)\r\n\r\n\r\n**Log file**\r\nGUI logs:\r\n[Log-20190811.txt](https://github.com/WOA-Project/WOA-Deployer-Rpi/files/3489494/Log-20190811.txt)\r\nCMD Logs:\r\n```\r\nC:\\Users\\Administrador\\Downloads\\WOA.Deployer>\r\n[00:55:42 INF] Downloading UEFI\r\n[00:55:42 WRN] UEFI was already downloaded. Skipping download.\r\n[00:55:42 INF] Fetching zip from http://ww1.microchip.com/downloads/en//softwarelibrary/obj-lan95xx-windows/lan9500-wdf-v18.12.18.0.zip\r\n[00:55:42 INF] Fetching zip from http://ww1.microchip.com/downloads/en//softwarelibrary/obj-lan78xx-windows/lan7800-wdf-v18.12.14.0.zip\r\n[00:55:42 INF] Fetching zip from https://pi64.win/wp-content/uploads/2019/02/usb-drivers.zip\r\n[00:55:42 INF] Fetching from GitHub subfolder: https://github.com/driver1998/bsp/bsp-master/prebuilt to Downloaded\\Drivers\\BSP Drivers\r\n[00:55:42 WRN] https://github.com/driver1998/bspbsp-master/prebuilt was already downloaded. Skipping download.\r\n[00:55:42 INF] License from Downloaded\\Drivers\\USB\\license.md\r\nBy continuing you are accepting the following license below.\r\nIf you decline it, press Control+C anytime during the deployment process.\r\n# TrueTask\u00ae USB\r\n## ARM64 host drivers for Windows 10 on the Raspberry Pi\r\n### Software License Agreement and Warranty Statement\r\n\r\nMCCI IS WILLING TO LICENSE THE SOFTWARE TO YOU ONLY UPON THE CONDITION THAT YOU ACCEPT ALL OF THE TERMS CONTAINED IN THIS LICENSE AGREEMENT. PLEASE READ THE TERMS CAREFULLY AND CLICK ON \"ACCEPT\" BEFORE INSTALLING THE SOFTWARE, AS CLICKING ON \"ACCEPT\" AND INSTALLING THE SOFTWARE WILL INDICATE YOUR AGREEMENT WITH THEM. 
IF YOU DO NOT AGREE WITH THESE TERMS, THEN MCCI IS UNWILLING TO LICENSE THE SOFTWARE TO YOU, IN WHICH EVENT YOU SHOULD NOT PROCEED WITH INSTALLING THE SOFTWARE.\r\n\r\nTO THE EXTENT THESE TERMS CONFLICT WITH ANY PREVIOUSLY SIGNED AND WRITTEN AGREED TO TERMS BETWEEN YOU AND MCCI, THE PRIOR TERMS SHALL CONTROL.\r\n\r\nThe software which accompanies this license (the \"Software\") and any accompanying documentation (the \"Documentation\") is the property of MCCI Corporation (\"MCCI\") or its licensors and is protected by copyright law. This agreement is between MCCI and the individual who downloads and wishes to use the Software and Documentation (\"You\"). While MCCI continues to own the software, You will have certain rights to use the Software upon your acceptance of this license. MCCI grants You a non-transferable and non-exclusive license to use this Software under the terms of this agreement. MCCI remains the proprietor of this software and licenses its use to You. You do not obtain title to the Software or Documentation or any copyrights or proprietary rights in the software. You assume responsibility for the selection of the Software to achieve your intended results, and for the installation, use and results obtained from the Software. Additional rights and obligations regarding the Software and its contents, and/or the Documentation may be defined by a separate written agreement with MCCI, and if so, such separate written agreement shall be controlling.\r\n\r\nIn the absence of conflict of such separate written agreement or except as may be modified by such a license addendum which accompanies this license, your rights and obligations with respect to use of this Software and Documentation are as follows:\r\n\r\nThis license does NOT extend to any corporation or organization of which You are a member or with which You are affiliated. It is only for personal use. 
Commercial licenses for corporations and organizations are available from MCCI.\r\n\r\nYOU MAY: License Grant: You are granted non-exclusive rights to install and use the Software for personal use and evaluation purposes only. You may install on any Raspberry Pi computer that you personally own, provided that You acquire and dedicate a licensed copy of the Software for each computer on which the Software is used or to which it is transmitted over the internal network. You may also make backup copies of the Software.\r\n\r\nRESTRICTIONS YOU MAY NOT: (i) permit others to use the Software, except as expressly provided above for authorized network use; (ii) modify or translate the Software; (iii) reverse engineer, decompile, or disassemble the Software, except to the extent this restriction is expressly prohibited by applicable law; (iv) create derivative works based on the Software; (v) merge the Software with another product; (vi) export or use the Software data compilations, structures, or algorithms with another product; (vii) copy the Software, except as expressly provided above; (viii) remove or obscure any proprietary rights notices or labels on the Software; (ix) post the software on a website for public download, or (x) resell or distribute the Software, either stand-alone or bundled with or installed with hardware or software supplied by You or others, or (xi) distribute the Software or Documentation in any form (electronic or otherwise).\r\n\r\nTERM AND TERMINATION. The license provided in this Agreement will continue in perpetuity unless You fail to comply with the terms and conditions of this Agreement. You agree that, upon such termination, you will either destroy (or permanently erase) all copies of the Software and Documentation, or return the original Software and Documentation to MCCI, together with any other material you have received from MCCI in connection with the Software.\r\n\r\nTRANSFERS. 
You may not transfer the Software or any rights under this Agreement without the prior written consent of MCCI. Any attempted transfer or assignment in violation of this provision shall be null and void.\r\n\r\nFEEDBACK. You agree that in the event You voluntarily disclose any ideas or suggestions to MCCI (in any manner, whether in writing or orally or otherwise) regarding the Software, Documentation, or Design Techniques, including possible enhancements or improvements (\"Feedback\"), MCCI may freely use and disseminate such Feedback. You agree not to claim that MCCI owes You any compensation for its use or dissemination of such Feedback.\r\n\r\nOWNERSHIP. MCCI and its suppliers own the Software and all intellectual property rights embodied therein, including copyrights and valuable trade secrets embodied in the Software's design and coding methodology. The Software is protected by United States copyright laws and international treaty provisions. This Agreement provides You only a limited use license, and no ownership of any intellectual property. All content accessed through the Software is the property of the applicable content owner and may be protected by applicable copyright law. This license gives You no rights to such content.\r\n\r\nWRITTEN RECORD. The text of this Agreement is included with the Software files as \"ttusb-pi64-installer-license.rtf.\" You agree to print this text file immediately after installation of the Software and to maintain the printed copy as a written record of this transaction.\r\n\r\nDISCLAIMER OF WARRANTY; LIMITATION OF LIABILITY **.** MCCI PROVIDES THE SOFTWARE AND THE DOCUMENTATION \"AS IS\" WITHOUT WARRANTY OF ANY KIND EITHER EXPRESS IMPLIED OR STATUTORY, INCLUDING WARRANTIES OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY, AND FITNESS FOR A PARTICULAR PURPOSE. 
THERE IS NO WARRANTY OR GUARANTEE THAT THE OPERATION OF THE SOFTWARE WILL BE UNINTERRUPTED, ERROR-FREE, OR VIRUS-FREE, OR THAT THE SOFTWARE WILL MEET ANY PARTICULAR CRITERIA OF PERFORMANCE OR QUALITY EXCEPT AS EXPRESSLY PROVIDED IN THE LIMITED WARRANTY. All risk of quality and performance of the software and documentation is with You. This disclaimer of warranty constitutes an essential part of this Agreement.\r\n\r\nTo the extent that this Warranty Statement is inconsistent with the jurisdiction where You use the Software, the Warranty Statement shall be deemed to be modified consistent with such local law but to the maximum extent enforceable in such jurisdiction. Under such local law, certain limitations may not apply, and You may have additional rights which vary from jurisdiction to jurisdiction. For example, some states in the United States and some jurisdictions outside the United States may: (i) preclude the disclaimers and limitations of this Warranty Statement from limiting the rights of a consumer; (ii) otherwise restrict the ability of a manufacturer to make such disclaimers or to impose such limitations; or (iii) grant the consumer additional legal rights, specify the duration of implied warranties which the manufacturer cannot disclaim, or prohibit limitations on how long an implied warranty lasts.\r\n\r\nIN NO EVENT AND UNDER NO LEGAL THEORY, INCLUDING WITHOUT LIMITATION, TORT, CONTRACT, OR STRICT PRODUCTS LIABILITY, SHALL MCCI OR ANY OF ITS SUPPLIERS BE LIABLE TO YOU OR ANY OTHER PERSON FOR ANY PERSONAL INJURY, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES OF ANY KIND, INCLUDING WITHOUT LIMITATION, DAMAGES FOR LOSS OF GOODWILL, WORK STOPPAGE, COMPUTER MALFUNCTION, OR ANY OTHER KIND OF COMMERCIAL DAMAGE, EVEN IF MCCI HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. IN NO EVENT SHALL MCCI BE LIABLE FOR DAMAGES IN EXCESS OF THE AMOUNT PAID BY YOU TO MCCI OR THIS SOFTWARE LICENSE. 
SOME JURISDICTIONS DO NOT ALLOW THE LIMITATION OF LIABILITY FOR PERSONAL INJURY, OR OF INCIDENTAL OR CONSEQUENTIAL DAMAGES, SO THIS LIMITATION MAY NOT APPLY TO YOU. In no event shall MCCI's total liability to You for all damages (other than as may be required by applicable law in cases involving personal injury) exceed the amount of fifty dollars ($50.00). The foregoing limitations will apply even if the above stated remedy fails of its essential purpose.\r\n\r\nEXPORT CONTROLS. You may not use or otherwise export the Software except as authorized by United States law and the laws of the jurisdiction in which the Software was obtained. In particular, but without limitation, the Software or underlying information or technology may not be exported or re-exported (i) into (or to a national or resident of) Cuba, Libya, North Korea, Iran, Syria or any other country to which the United States has embargoed goods; or (ii) to anyone on the U.S. Treasury Department's list of Specially Designated Nationals, the U.S. Commerce Department's Table of Denial Orders, or the U.S. Department of Commerce Denied Person's List or Entity List. By downloading, ordering or using the Software, You agree to the foregoing and represent that You are not located in, under the control of, or a national or resident of any such country or on any such list. You also agree that you will not use the Software for any purposes prohibited by United States law, including, without limitation, the development, design, manufacture or production of missiles, or nuclear, chemical or biological weapons.\r\n\r\nMISCELLANEOUS. This Agreement constitutes the entire understanding of the parties with respect to the subject matter of this Agreement and merges all prior communications, representations, and agreements. This Agreement may be modified only by a written agreement signed by the parties. 
If any provision of this Agreement is held to be unenforceable for any reason, such provision shall be reformed only to the extent necessary to make it enforceable. This Agreement shall be construed under the laws of the State of New York, USA, excluding rules regarding conflicts of law. The application of the United Nations Convention of Contracts for the International Sale of Goods is expressly excluded.\r\n\r\nUNITED STATES GOVERNMENT USE. MCCI represents that the Software and its documentation were developed at private expense and no part of same is in the public domain. The Software is Commercial Computer Software provided with RESTRICTED RIGHTS under the Federal Acquisition Regulations and agency supplements to them. Use, duplication, or disclosure by the U.S. Government is subject to the restrictions as set forth in the Rights in Technical Data and Computer Software clause at DFAR 252.227-7013 et. seq. or the Commercial Computer Software Restricted Rights at DFAR 52.227-19, as applicable. 
Contractor is MCCI Corporation, 3520 Krums Corners Road, Ithaca, NY 14850, USA.\r\n\r\nTrueTask and MCCI are registered trademarks of MCCI Corporation.\r\n\r\n\r\n\r\n[00:55:42 INF] Deploying Windows\r\n[00:55:45 FTL] Operation failed\r\nSystem.InvalidOperationException: A sequ\u00eancia n\u00e3o cont\u00e9m elementos de correspond\u00eancia\r\n em System.Linq.Enumerable.First[TSource](IEnumerable`1 source, Func`2 predicate)\r\n em Deployer.Raspberry.RaspberryPi.d__4.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Tasks.DeployWindows.d__4.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Execution.ScriptRunner.d__5.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Execution.ScriptRunner.d__4.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Raspberry.WoaDeployer.d__3.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em 
System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Raspberry.Console.Program.d__1.MoveNext() na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Console\\Program.cs:linha 49\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Raspberry.Console.Program.
d__0.MoveNext() na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Console\\Program.cs:linha 28\r\n\r\nExce\u00e7\u00e3o Sem Tratamento: System.AggregateException: Um ou mais erros. ---> System.InvalidOperationException: A sequ\u00eancia n\u00e3o cont\u00e9m elementos de correspond\u00eancia\r\n em System.Linq.Enumerable.First[TSource](IEnumerable`1 source, Func`2 predicate)\r\n em Deployer.Raspberry.RaspberryPi.d__4.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Tasks.DeployWindows.d__4.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Execution.ScriptRunner.d__5.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Execution.ScriptRunner.d__4.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Raspberry.WoaDeployer.d__3.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em 
System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Raspberry.Console.Program.d__1.MoveNext() na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Console\\Program.cs:linha 49\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Raspberry.Console.Program.
d__0.MoveNext() na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Console\\Program.cs:linha 36\r\n --- Fim do rastreamento de pilha de exce\u00e7\u00f5es internas ---\r\n em System.Threading.Tasks.Task.ThrowIfExceptional(Boolean includeTaskCanceledExceptions)\r\n em System.Threading.Tasks.Task.Wait(Int32 millisecondsTimeout, CancellationToken cancellationToken)\r\n em Deployer.Raspberry.Gui.App.<>c__DisplayClass2_0.b__0() na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Gui\\App.xaml.cs:linha 40\r\n em Deployer.Gui.Common.ConsoleEmbedder.ExecuteInsideConsole(Action consoleAction) na D:\\a\\1\\s\\Source\\DeployerPlatform\\Deployer.Gui.Common\\ConsoleEmbedder.cs:linha 33\r\n em Deployer.Raspberry.Gui.App.LaunchConsole(String[] args) na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Gui\\App.xaml.cs:linha 40\r\n em Deployer.Raspberry.Gui.App.OnStartup(StartupEventArgs e) na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Gui\\App.xaml.cs:linha 21\r\n em System.Windows.Application.<.ctor>b__1_0(Object unused)\r\n em System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Int32 numArgs)\r\n em System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source, Delegate callback, Object args, Int32 numArgs, Delegate catchHandler)\r\n em System.Windows.Threading.DispatcherOperation.InvokeImpl()\r\n em System.Windows.Threading.DispatcherOperation.InvokeInSecurityContext(Object state)\r\n em MS.Internal.CulturePreservingExecutionContext.CallbackWrapper(Object obj)\r\n em System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)\r\n em System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)\r\n em System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)\r\n em MS.Internal.CulturePreservingExecutionContext.Run(CulturePreservingExecutionContext 
executionContext, ContextCallback callback, Object state)\r\n em System.Windows.Threading.DispatcherOperation.Invoke()\r\n em System.Windows.Threading.Dispatcher.ProcessQueue()\r\n em System.Windows.Threading.Dispatcher.WndProcHook(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)\r\n em MS.Win32.HwndWrapper.WndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)\r\n em MS.Win32.HwndSubclass.DispatcherCallbackOperation(Object o)\r\n em System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Int32 numArgs)\r\n em System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source, Delegate callback, Object args, Int32 numArgs, Delegate catchHandler)\r\n em System.Windows.Threading.Dispatcher.LegacyInvokeImpl(DispatcherPriority priority, TimeSpan timeout, Delegate method, Object args, Int32 numArgs)\r\n em MS.Win32.HwndSubclass.SubclassWndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam)\r\n em MS.Win32.UnsafeNativeMethods.DispatchMessage(MSG& msg)\r\n em System.Windows.Threading.Dispatcher.PushFrameImpl(DispatcherFrame frame)\r\n em System.Windows.Threading.Dispatcher.PushFrame(DispatcherFrame frame)\r\n em System.Windows.Application.RunDispatcher(Object ignore)\r\n em System.Windows.Application.RunInternal(Window window)\r\n em System.Windows.Application.Run(Window window)\r\n em Deployer.Raspberry.Gui.App.Main()\r\n```\r\n","title":"SDCard not showing up","body":"Im trying to install windows 10 Pro Build 18956, but in GUI my sdcard not showing up, and in cmd line `\"C:\\Users\\Administrador\\Downloads\\WOA.Deployer\\WoaDeployer.exe\" deploy --disk 2 --wim install.wim` give me some erros \r\n\r\n**Screenshots**\r\nMy SDCARD showing normaly on diskmgmt.msc\r\n![image](https://user-images.githubusercontent.com/3611208/62829468-95ab4080-bbd4-11e9-843f-0acff64d775d.png)\r\nand on 
explorer\r\n![image](https://user-images.githubusercontent.com/3611208/62829437-cb9bf500-bbd3-11e9-86c8-bb306f4fb0df.png)\r\nUsing GUI utility\r\n![image](https://user-images.githubusercontent.com/3611208/62829408-3e58a080-bbd3-11e9-8757-4212c7aa4d9c.png)\r\nUsing cmd line utility\r\n![image](https://user-images.githubusercontent.com/3611208/62829392-cdb18400-bbd2-11e9-95fc-3f53caaab13e.png)\r\n\r\n\r\n**Log file**\r\nGUI logs:\r\n[Log-20190811.txt](https://github.com/WOA-Project/WOA-Deployer-Rpi/files/3489494/Log-20190811.txt)\r\nCMD Logs:\r\n```\r\nC:\\Users\\Administrador\\Downloads\\WOA.Deployer>\r\n[00:55:42 INF] Downloading UEFI\r\n[00:55:42 WRN] UEFI was already downloaded. Skipping download.\r\n[00:55:42 INF] Fetching zip from http://ww1.microchip.com/downloads/en//softwarelibrary/obj-lan95xx-windows/lan9500-wdf-v18.12.18.0.zip\r\n[00:55:42 INF] Fetching zip from http://ww1.microchip.com/downloads/en//softwarelibrary/obj-lan78xx-windows/lan7800-wdf-v18.12.14.0.zip\r\n[00:55:42 INF] Fetching zip from https://pi64.win/wp-content/uploads/2019/02/usb-drivers.zip\r\n[00:55:42 INF] Fetching from GitHub subfolder: https://github.com/driver1998/bsp/bsp-master/prebuilt to Downloaded\\Drivers\\BSP Drivers\r\n[00:55:42 WRN] https://github.com/driver1998/bspbsp-master/prebuilt was already downloaded. Skipping download.\r\n[00:55:42 INF] License from Downloaded\\Drivers\\USB\\license.md\r\nBy continuing you are accepting the following license below.\r\nIf you decline it, press Control+C anytime during the deployment process.\r\n# TrueTask\u00ae USB\r\n## ARM64 host drivers for Windows 10 on the Raspberry Pi\r\n### Software License Agreement and Warranty Statement\r\n\r\nMCCI IS WILLING TO LICENSE THE SOFTWARE TO YOU ONLY UPON THE CONDITION THAT YOU ACCEPT ALL OF THE TERMS CONTAINED IN THIS LICENSE AGREEMENT. 
PLEASE READ THE TERMS CAREFULLY AND CLICK ON \"ACCEPT\" BEFORE INSTALLING THE SOFTWARE, AS CLICKING ON \"ACCEPT\" AND INSTALLING THE SOFTWARE WILL INDICATE YOUR AGREEMENT WITH THEM. IF YOU DO NOT AGREE WITH THESE TERMS, THEN MCCI IS UNWILLING TO LICENSE THE SOFTWARE TO YOU, IN WHICH EVENT YOU SHOULD NOT PROCEED WITH INSTALLING THE SOFTWARE.\r\n\r\nTO THE EXTENT THESE TERMS CONFLICT WITH ANY PREVIOUSLY SIGNED AND WRITTEN AGREED TO TERMS BETWEEN YOU AND MCCI, THE PRIOR TERMS SHALL CONTROL.\r\n\r\nThe software which accompanies this license (the \"Software\") and any accompanying documentation (the \"Documentation\") is the property of MCCI Corporation (\"MCCI\") or its licensors and is protected by copyright law. This agreement is between MCCI and the individual who downloads and wishes to use the Software and Documentation (\"You\"). While MCCI continues to own the software, You will have certain rights to use the Software upon your acceptance of this license. MCCI grants You a non-transferable and non-exclusive license to use this Software under the terms of this agreement. MCCI remains the proprietor of this software and licenses its use to You. You do not obtain title to the Software or Documentation or any copyrights or proprietary rights in the software. You assume responsibility for the selection of the Software to achieve your intended results, and for the installation, use and results obtained from the Software. 
Additional rights and obligations regarding the Software and its contents, and/or the Documentation may be defined by a separate written agreement with MCCI, and if so, such separate written agreement shall be controlling.\r\n\r\nIn the absence of conflict of such separate written agreement or except as may be modified by such a license addendum which accompanies this license, your rights and obligations with respect to use of this Software and Documentation are as follows:\r\n\r\nThis license does NOT extend to any corporation or organization of which You are a member or with which You are affiliated. It is only for personal use. Commercial licenses for corporations and organizations are available from MCCI.\r\n\r\nYOU MAY: License Grant: You are granted non-exclusive rights to install and use the Software for personal use and evaluation purposes only. You may install on any Raspberry Pi computer that you personally own, provided that You acquire and dedicate a licensed copy of the Software for each computer on which the Software is used or to which it is transmitted over the internal network. 
You may also make backup copies of the Software.\r\n\r\nRESTRICTIONS YOU MAY NOT: (i) permit others to use the Software, except as expressly provided above for authorized network use; (ii) modify or translate the Software; (iii) reverse engineer, decompile, or disassemble the Software, except to the extent this restriction is expressly prohibited by applicable law; (iv) create derivative works based on the Software; (v) merge the Software with another product; (vi) export or use the Software data compilations, structures, or algorithms with another product; (vii) copy the Software, except as expressly provided above; (viii) remove or obscure any proprietary rights notices or labels on the Software; (ix) post the software on a website for public download, or (x) resell or distribute the Software, either stand-alone or bundled with or installed with hardware or software supplied by You or others, or (xi) distribute the Software or Documentation in any form (electronic or otherwise).\r\n\r\nTERM AND TERMINATION. The license provided in this Agreement will continue in perpetuity unless You fail to comply with the terms and conditions of this Agreement. You agree that, upon such termination, you will either destroy (or permanently erase) all copies of the Software and Documentation, or return the original Software and Documentation to MCCI, together with any other material you have received from MCCI in connection with the Software.\r\n\r\nTRANSFERS. You may not transfer the Software or any rights under this Agreement without the prior written consent of MCCI. Any attempted transfer or assignment in violation of this provision shall be null and void.\r\n\r\nFEEDBACK. 
You agree that in the event You voluntarily disclose any ideas or suggestions to MCCI (in any manner, whether in writing or orally or otherwise) regarding the Software, Documentation, or Design Techniques, including possible enhancements or improvements (\"Feedback\"), MCCI may freely use and disseminate such Feedback. You agree not to claim that MCCI owes You any compensation for its use or dissemination of such Feedback.\r\n\r\nOWNERSHIP. MCCI and its suppliers own the Software and all intellectual property rights embodied therein, including copyrights and valuable trade secrets embodied in the Software's design and coding methodology. The Software is protected by United States copyright laws and international treaty provisions. This Agreement provides You only a limited use license, and no ownership of any intellectual property. All content accessed through the Software is the property of the applicable content owner and may be protected by applicable copyright law. This license gives You no rights to such content.\r\n\r\nWRITTEN RECORD. The text of this Agreement is included with the Software files as \"ttusb-pi64-installer-license.rtf.\" You agree to print this text file immediately after installation of the Software and to maintain the printed copy as a written record of this transaction.\r\n\r\nDISCLAIMER OF WARRANTY; LIMITATION OF LIABILITY **.** MCCI PROVIDES THE SOFTWARE AND THE DOCUMENTATION \"AS IS\" WITHOUT WARRANTY OF ANY KIND EITHER EXPRESS IMPLIED OR STATUTORY, INCLUDING WARRANTIES OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY, AND FITNESS FOR A PARTICULAR PURPOSE. THERE IS NO WARRANTY OR GUARANTEE THAT THE OPERATION OF THE SOFTWARE WILL BE UNINTERRUPTED, ERROR-FREE, OR VIRUS-FREE, OR THAT THE SOFTWARE WILL MEET ANY PARTICULAR CRITERIA OF PERFORMANCE OR QUALITY EXCEPT AS EXPRESSLY PROVIDED IN THE LIMITED WARRANTY. All risk of quality and performance of the software and documentation is with You. 
This disclaimer of warranty constitutes an essential part of this Agreement.\r\n\r\nTo the extent that this Warranty Statement is inconsistent with the jurisdiction where You use the Software, the Warranty Statement shall be deemed to be modified consistent with such local law but to the maximum extent enforceable in such jurisdiction. Under such local law, certain limitations may not apply, and You may have additional rights which vary from jurisdiction to jurisdiction. For example, some states in the United States and some jurisdictions outside the United States may: (i) preclude the disclaimers and limitations of this Warranty Statement from limiting the rights of a consumer; (ii) otherwise restrict the ability of a manufacturer to make such disclaimers or to impose such limitations; or (iii) grant the consumer additional legal rights, specify the duration of implied warranties which the manufacturer cannot disclaim, or prohibit limitations on how long an implied warranty lasts.\r\n\r\nIN NO EVENT AND UNDER NO LEGAL THEORY, INCLUDING WITHOUT LIMITATION, TORT, CONTRACT, OR STRICT PRODUCTS LIABILITY, SHALL MCCI OR ANY OF ITS SUPPLIERS BE LIABLE TO YOU OR ANY OTHER PERSON FOR ANY PERSONAL INJURY, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES OF ANY KIND, INCLUDING WITHOUT LIMITATION, DAMAGES FOR LOSS OF GOODWILL, WORK STOPPAGE, COMPUTER MALFUNCTION, OR ANY OTHER KIND OF COMMERCIAL DAMAGE, EVEN IF MCCI HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. IN NO EVENT SHALL MCCI BE LIABLE FOR DAMAGES IN EXCESS OF THE AMOUNT PAID BY YOU TO MCCI OR THIS SOFTWARE LICENSE. SOME JURISDICTIONS DO NOT ALLOW THE LIMITATION OF LIABILITY FOR PERSONAL INJURY, OR OF INCIDENTAL OR CONSEQUENTIAL DAMAGES, SO THIS LIMITATION MAY NOT APPLY TO YOU. In no event shall MCCI's total liability to You for all damages (other than as may be required by applicable law in cases involving personal injury) exceed the amount of fifty dollars ($50.00). 
The foregoing limitations will apply even if the above stated remedy fails of its essential purpose.\r\n\r\nEXPORT CONTROLS. You may not use or otherwise export the Software except as authorized by United States law and the laws of the jurisdiction in which the Software was obtained. In particular, but without limitation, the Software or underlying information or technology may not be exported or re-exported (i) into (or to a national or resident of) Cuba, Libya, North Korea, Iran, Syria or any other country to which the United States has embargoed goods; or (ii) to anyone on the U.S. Treasury Department's list of Specially Designated Nationals, the U.S. Commerce Department's Table of Denial Orders, or the U.S. Department of Commerce Denied Person's List or Entity List. By downloading, ordering or using the Software, You agree to the foregoing and represent that You are not located in, under the control of, or a national or resident of any such country or on any such list. You also agree that you will not use the Software for any purposes prohibited by United States law, including, without limitation, the development, design, manufacture or production of missiles, or nuclear, chemical or biological weapons.\r\n\r\nMISCELLANEOUS. This Agreement constitutes the entire understanding of the parties with respect to the subject matter of this Agreement and merges all prior communications, representations, and agreements. This Agreement may be modified only by a written agreement signed by the parties. If any provision of this Agreement is held to be unenforceable for any reason, such provision shall be reformed only to the extent necessary to make it enforceable. This Agreement shall be construed under the laws of the State of New York, USA, excluding rules regarding conflicts of law. The application of the United Nations Convention of Contracts for the International Sale of Goods is expressly excluded.\r\n\r\nUNITED STATES GOVERNMENT USE. 
MCCI represents that the Software and its documentation were developed at private expense and no part of same is in the public domain. The Software is Commercial Computer Software provided with RESTRICTED RIGHTS under the Federal Acquisition Regulations and agency supplements to them. Use, duplication, or disclosure by the U.S. Government is subject to the restrictions as set forth in the Rights in Technical Data and Computer Software clause at DFAR 252.227-7013 et. seq. or the Commercial Computer Software Restricted Rights at DFAR 52.227-19, as applicable. Contractor is MCCI Corporation, 3520 Krums Corners Road, Ithaca, NY 14850, USA.\r\n\r\nTrueTask and MCCI are registered trademarks of MCCI Corporation.\r\n\r\n\r\n\r\n[00:55:42 INF] Deploying Windows\r\n[00:55:45 FTL] Operation failed\r\nSystem.InvalidOperationException: A sequ\u00eancia n\u00e3o cont\u00e9m elementos de correspond\u00eancia\r\n em System.Linq.Enumerable.First[TSource](IEnumerable`1 source, Func`2 predicate)\r\n em Deployer.Raspberry.RaspberryPi.d__4.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Tasks.DeployWindows.d__4.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Execution.ScriptRunner.d__5.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em 
Deployer.Execution.ScriptRunner.d__4.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Raspberry.WoaDeployer.d__3.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Raspberry.Console.Program.d__1.MoveNext() na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Console\\Program.cs:linha 49\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Raspberry.Console.Program.
d__0.MoveNext() na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Console\\Program.cs:linha 28\r\n\r\nExce\u00e7\u00e3o Sem Tratamento: System.AggregateException: Um ou mais erros. ---> System.InvalidOperationException: A sequ\u00eancia n\u00e3o cont\u00e9m elementos de correspond\u00eancia\r\n em System.Linq.Enumerable.First[TSource](IEnumerable`1 source, Func`2 predicate)\r\n em Deployer.Raspberry.RaspberryPi.d__4.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Tasks.DeployWindows.d__4.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Execution.ScriptRunner.d__5.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Execution.ScriptRunner.d__4.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Raspberry.WoaDeployer.d__3.MoveNext()\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em 
System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Raspberry.Console.Program.d__1.MoveNext() na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Console\\Program.cs:linha 49\r\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\r\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n em Deployer.Raspberry.Console.Program.
d__0.MoveNext() na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Console\\Program.cs:linha 36\r\n --- Fim do rastreamento de pilha de exce\u00e7\u00f5es internas ---\r\n em System.Threading.Tasks.Task.ThrowIfExceptional(Boolean includeTaskCanceledExceptions)\r\n em System.Threading.Tasks.Task.Wait(Int32 millisecondsTimeout, CancellationToken cancellationToken)\r\n em Deployer.Raspberry.Gui.App.<>c__DisplayClass2_0.b__0() na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Gui\\App.xaml.cs:linha 40\r\n em Deployer.Gui.Common.ConsoleEmbedder.ExecuteInsideConsole(Action consoleAction) na D:\\a\\1\\s\\Source\\DeployerPlatform\\Deployer.Gui.Common\\ConsoleEmbedder.cs:linha 33\r\n em Deployer.Raspberry.Gui.App.LaunchConsole(String[] args) na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Gui\\App.xaml.cs:linha 40\r\n em Deployer.Raspberry.Gui.App.OnStartup(StartupEventArgs e) na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Gui\\App.xaml.cs:linha 21\r\n em System.Windows.Application.<.ctor>b__1_0(Object unused)\r\n em System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Int32 numArgs)\r\n em System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source, Delegate callback, Object args, Int32 numArgs, Delegate catchHandler)\r\n em System.Windows.Threading.DispatcherOperation.InvokeImpl()\r\n em System.Windows.Threading.DispatcherOperation.InvokeInSecurityContext(Object state)\r\n em MS.Internal.CulturePreservingExecutionContext.CallbackWrapper(Object obj)\r\n em System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)\r\n em System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)\r\n em System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)\r\n em MS.Internal.CulturePreservingExecutionContext.Run(CulturePreservingExecutionContext 
executionContext, ContextCallback callback, Object state)\r\n em System.Windows.Threading.DispatcherOperation.Invoke()\r\n em System.Windows.Threading.Dispatcher.ProcessQueue()\r\n em System.Windows.Threading.Dispatcher.WndProcHook(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)\r\n em MS.Win32.HwndWrapper.WndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)\r\n em MS.Win32.HwndSubclass.DispatcherCallbackOperation(Object o)\r\n em System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Int32 numArgs)\r\n em System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source, Delegate callback, Object args, Int32 numArgs, Delegate catchHandler)\r\n em System.Windows.Threading.Dispatcher.LegacyInvokeImpl(DispatcherPriority priority, TimeSpan timeout, Delegate method, Object args, Int32 numArgs)\r\n em MS.Win32.HwndSubclass.SubclassWndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam)\r\n em MS.Win32.UnsafeNativeMethods.DispatchMessage(MSG& msg)\r\n em System.Windows.Threading.Dispatcher.PushFrameImpl(DispatcherFrame frame)\r\n em System.Windows.Threading.Dispatcher.PushFrame(DispatcherFrame frame)\r\n em System.Windows.Application.RunDispatcher(Object ignore)\r\n em System.Windows.Application.RunInternal(Window window)\r\n em System.Windows.Application.Run(Window window)\r\n em Deployer.Raspberry.Gui.App.Main()\r\n```\r\n","html":"

SDCard not showing up

\n\n

I'm trying to install Windows 10 Pro Build 18956, but in the GUI my SD card is not showing up, and on the command line \"C:\\Users\\Administrador\\Downloads\\WOA.Deployer\\WoaDeployer.exe\" deploy --disk 2 --wim install.wim gives me some errors

\n\n

Screenshots\nMy SD card shows up normally in diskmgmt.msc\n\"image\"\nand in Explorer\n\"image\"\nUsing the GUI utility\n\"image\"\nUsing the command-line utility\n\"image\"

\n\n

Log file\nGUI logs:\nLog-20190811.txt\nCMD Logs:\n```\nC:\\Users\\Administrador\\Downloads\\WOA.Deployer>\n[00:55:42 INF] Downloading UEFI\n[00:55:42 WRN] UEFI was already downloaded. Skipping download.\n[00:55:42 INF] Fetching zip from http://ww1.microchip.com/downloads/en//softwarelibrary/obj-lan95xx-windows/lan9500-wdf-v18.12.18.0.zip\n[00:55:42 INF] Fetching zip from http://ww1.microchip.com/downloads/en//softwarelibrary/obj-lan78xx-windows/lan7800-wdf-v18.12.14.0.zip\n[00:55:42 INF] Fetching zip from https://pi64.win/wp-content/uploads/2019/02/usb-drivers.zip\n[00:55:42 INF] Fetching from GitHub subfolder: https://github.com/driver1998/bsp/bsp-master/prebuilt to Downloaded\\Drivers\\BSP Drivers\n[00:55:42 WRN] https://github.com/driver1998/bspbsp-master/prebuilt was already downloaded. Skipping download.\n[00:55:42 INF] License from Downloaded\\Drivers\\USB\\license.md\nBy continuing you are accepting the following license below.\nIf you decline it, press Control+C anytime during the deployment process.

\n\n

TrueTask\u00ae USB

\n\n

ARM64 host drivers for Windows 10 on the Raspberry Pi

\n\n

Software License Agreement and Warranty Statement

\n\n

MCCI IS WILLING TO LICENSE THE SOFTWARE TO YOU ONLY UPON THE CONDITION THAT YOU ACCEPT ALL OF THE TERMS CONTAINED IN THIS LICENSE AGREEMENT. PLEASE READ THE TERMS CAREFULLY AND CLICK ON \"ACCEPT\" BEFORE INSTALLING THE SOFTWARE, AS CLICKING ON \"ACCEPT\" AND INSTALLING THE SOFTWARE WILL INDICATE YOUR AGREEMENT WITH THEM. IF YOU DO NOT AGREE WITH THESE TERMS, THEN MCCI IS UNWILLING TO LICENSE THE SOFTWARE TO YOU, IN WHICH EVENT YOU SHOULD NOT PROCEED WITH INSTALLING THE SOFTWARE.

\n\n

TO THE EXTENT THESE TERMS CONFLICT WITH ANY PREVIOUSLY SIGNED AND WRITTEN AGREED TO TERMS BETWEEN YOU AND MCCI, THE PRIOR TERMS SHALL CONTROL.

\n\n

The software which accompanies this license (the \"Software\") and any accompanying documentation (the \"Documentation\") is the property of MCCI Corporation (\"MCCI\") or its licensors and is protected by copyright law. This agreement is between MCCI and the individual who downloads and wishes to use the Software and Documentation (\"You\"). While MCCI continues to own the software, You will have certain rights to use the Software upon your acceptance of this license. MCCI grants You a non-transferable and non-exclusive license to use this Software under the terms of this agreement. MCCI remains the proprietor of this software and licenses its use to You. You do not obtain title to the Software or Documentation or any copyrights or proprietary rights in the software. You assume responsibility for the selection of the Software to achieve your intended results, and for the installation, use and results obtained from the Software. Additional rights and obligations regarding the Software and its contents, and/or the Documentation may be defined by a separate written agreement with MCCI, and if so, such separate written agreement shall be controlling.

\n\n

In the absence of conflict of such separate written agreement or except as may be modified by such a license addendum which accompanies this license, your rights and obligations with respect to use of this Software and Documentation are as follows:

\n\n

This license does NOT extend to any corporation or organization of which You are a member or with which You are affiliated. It is only for personal use. Commercial licenses for corporations and organizations are available from MCCI.

\n\n

YOU MAY: License Grant: You are granted non-exclusive rights to install and use the Software for personal use and evaluation purposes only. You may install on any Raspberry Pi computer that you personally own, provided that You acquire and dedicate a licensed copy of the Software for each computer on which the Software is used or to which it is transmitted over the internal network. You may also make backup copies of the Software.

\n\n

RESTRICTIONS YOU MAY NOT: (i) permit others to use the Software, except as expressly provided above for authorized network use; (ii) modify or translate the Software; (iii) reverse engineer, decompile, or disassemble the Software, except to the extent this restriction is expressly prohibited by applicable law; (iv) create derivative works based on the Software; (v) merge the Software with another product; (vi) export or use the Software data compilations, structures, or algorithms with another product; (vii) copy the Software, except as expressly provided above; (viii) remove or obscure any proprietary rights notices or labels on the Software; (ix) post the software on a website for public download, or (x) resell or distribute the Software, either stand-alone or bundled with or installed with hardware or software supplied by You or others, or (xi) distribute the Software or Documentation in any form (electronic or otherwise).

\n\n

TERM AND TERMINATION. The license provided in this Agreement will continue in perpetuity unless You fail to comply with the terms and conditions of this Agreement. You agree that, upon such termination, you will either destroy (or permanently erase) all copies of the Software and Documentation, or return the original Software and Documentation to MCCI, together with any other material you have received from MCCI in connection with the Software.

\n\n

TRANSFERS. You may not transfer the Software or any rights under this Agreement without the prior written consent of MCCI. Any attempted transfer or assignment in violation of this provision shall be null and void.

\n\n

FEEDBACK. You agree that in the event You voluntarily disclose any ideas or suggestions to MCCI (in any manner, whether in writing or orally or otherwise) regarding the Software, Documentation, or Design Techniques, including possible enhancements or improvements (\"Feedback\"), MCCI may freely use and disseminate such Feedback. You agree not to claim that MCCI owes You any compensation for its use or dissemination of such Feedback.

\n\n

OWNERSHIP. MCCI and its suppliers own the Software and all intellectual property rights embodied therein, including copyrights and valuable trade secrets embodied in the Software's design and coding methodology. The Software is protected by United States copyright laws and international treaty provisions. This Agreement provides You only a limited use license, and no ownership of any intellectual property. All content accessed through the Software is the property of the applicable content owner and may be protected by applicable copyright law. This license gives You no rights to such content.

\n\n

WRITTEN RECORD. The text of this Agreement is included with the Software files as \"ttusb-pi64-installer-license.rtf.\" You agree to print this text file immediately after installation of the Software and to maintain the printed copy as a written record of this transaction.

\n\n

DISCLAIMER OF WARRANTY; LIMITATION OF LIABILITY . MCCI PROVIDES THE SOFTWARE AND THE DOCUMENTATION \"AS IS\" WITHOUT WARRANTY OF ANY KIND EITHER EXPRESS IMPLIED OR STATUTORY, INCLUDING WARRANTIES OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY, AND FITNESS FOR A PARTICULAR PURPOSE. THERE IS NO WARRANTY OR GUARANTEE THAT THE OPERATION OF THE SOFTWARE WILL BE UNINTERRUPTED, ERROR-FREE, OR VIRUS-FREE, OR THAT THE SOFTWARE WILL MEET ANY PARTICULAR CRITERIA OF PERFORMANCE OR QUALITY EXCEPT AS EXPRESSLY PROVIDED IN THE LIMITED WARRANTY. All risk of quality and performance of the software and documentation is with You. This disclaimer of warranty constitutes an essential part of this Agreement.

\n\n

To the extent that this Warranty Statement is inconsistent with the jurisdiction where You use the Software, the Warranty Statement shall be deemed to be modified consistent with such local law but to the maximum extent enforceable in such jurisdiction. Under such local law, certain limitations may not apply, and You may have additional rights which vary from jurisdiction to jurisdiction. For example, some states in the United States and some jurisdictions outside the United States may: (i) preclude the disclaimers and limitations of this Warranty Statement from limiting the rights of a consumer; (ii) otherwise restrict the ability of a manufacturer to make such disclaimers or to impose such limitations; or (iii) grant the consumer additional legal rights, specify the duration of implied warranties which the manufacturer cannot disclaim, or prohibit limitations on how long an implied warranty lasts.

\n\n

IN NO EVENT AND UNDER NO LEGAL THEORY, INCLUDING WITHOUT LIMITATION, TORT, CONTRACT, OR STRICT PRODUCTS LIABILITY, SHALL MCCI OR ANY OF ITS SUPPLIERS BE LIABLE TO YOU OR ANY OTHER PERSON FOR ANY PERSONAL INJURY, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES OF ANY KIND, INCLUDING WITHOUT LIMITATION, DAMAGES FOR LOSS OF GOODWILL, WORK STOPPAGE, COMPUTER MALFUNCTION, OR ANY OTHER KIND OF COMMERCIAL DAMAGE, EVEN IF MCCI HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. IN NO EVENT SHALL MCCI BE LIABLE FOR DAMAGES IN EXCESS OF THE AMOUNT PAID BY YOU TO MCCI OR THIS SOFTWARE LICENSE. SOME JURISDICTIONS DO NOT ALLOW THE LIMITATION OF LIABILITY FOR PERSONAL INJURY, OR OF INCIDENTAL OR CONSEQUENTIAL DAMAGES, SO THIS LIMITATION MAY NOT APPLY TO YOU. In no event shall MCCI's total liability to You for all damages (other than as may be required by applicable law in cases involving personal injury) exceed the amount of fifty dollars ($50.00). The foregoing limitations will apply even if the above stated remedy fails of its essential purpose.

\n\n

EXPORT CONTROLS. You may not use or otherwise export the Software except as authorized by United States law and the laws of the jurisdiction in which the Software was obtained. In particular, but without limitation, the Software or underlying information or technology may not be exported or re-exported (i) into (or to a national or resident of) Cuba, Libya, North Korea, Iran, Syria or any other country to which the United States has embargoed goods; or (ii) to anyone on the U.S. Treasury Department's list of Specially Designated Nationals, the U.S. Commerce Department's Table of Denial Orders, or the U.S. Department of Commerce Denied Person's List or Entity List. By downloading, ordering or using the Software, You agree to the foregoing and represent that You are not located in, under the control of, or a national or resident of any such country or on any such list. You also agree that you will not use the Software for any purposes prohibited by United States law, including, without limitation, the development, design, manufacture or production of missiles, or nuclear, chemical or biological weapons.

\n\n

MISCELLANEOUS. This Agreement constitutes the entire understanding of the parties with respect to the subject matter of this Agreement and merges all prior communications, representations, and agreements. This Agreement may be modified only by a written agreement signed by the parties. If any provision of this Agreement is held to be unenforceable for any reason, such provision shall be reformed only to the extent necessary to make it enforceable. This Agreement shall be construed under the laws of the State of New York, USA, excluding rules regarding conflicts of law. The application of the United Nations Convention of Contracts for the International Sale of Goods is expressly excluded.

\n\n

UNITED STATES GOVERNMENT USE. MCCI represents that the Software and its documentation were developed at private expense and no part of same is in the public domain. The Software is Commercial Computer Software provided with RESTRICTED RIGHTS under the Federal Acquisition Regulations and agency supplements to them. Use, duplication, or disclosure by the U.S. Government is subject to the restrictions as set forth in the Rights in Technical Data and Computer Software clause at DFAR 252.227-7013 et. seq. or the Commercial Computer Software Restricted Rights at DFAR 52.227-19, as applicable. Contractor is MCCI Corporation, 3520 Krums Corners Road, Ithaca, NY 14850, USA.

\n\n

TrueTask and MCCI are registered trademarks of MCCI Corporation.

\n\n

[00:55:42 INF] Deploying Windows\n[00:55:45 FTL] Operation failed\nSystem.InvalidOperationException: A sequ\u00eancia n\u00e3o cont\u00e9m elementos de correspond\u00eancia\n em System.Linq.Enumerable.FirstTSource\n em Deployer.Raspberry.RaspberryPi.d4.MoveNext()\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\n em Deployer.Tasks.DeployWindows.d4.MoveNext()\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\n em Deployer.Execution.ScriptRunner.d5.MoveNext()\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\n em Deployer.Execution.ScriptRunner.d4.MoveNext()\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\n em Deployer.Raspberry.WoaDeployer.d3.MoveNext()\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\n em Deployer.Raspberry.Console.Program.d1.MoveNext() na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Console\\Program.cs:linha 49\n--- Fim do rastreamento de pilha do local anterior 
onde a exce\u00e7\u00e3o foi gerada ---\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\n em Deployer.Raspberry.Console.Program.
d__0.MoveNext() na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Console\\Program.cs:linha 28

\n\n

Exce\u00e7\u00e3o Sem Tratamento: System.AggregateException: Um ou mais erros. ---> System.InvalidOperationException: A sequ\u00eancia n\u00e3o cont\u00e9m elementos de correspond\u00eancia\n em System.Linq.Enumerable.FirstTSource\n em Deployer.Raspberry.RaspberryPi.d4.MoveNext()\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\n em Deployer.Tasks.DeployWindows.d4.MoveNext()\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\n em Deployer.Execution.ScriptRunner.d5.MoveNext()\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\n em Deployer.Execution.ScriptRunner.d4.MoveNext()\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\n em Deployer.Raspberry.WoaDeployer.d3.MoveNext()\n--- Fim do rastreamento de pilha do local anterior onde a exce\u00e7\u00e3o foi gerada ---\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\n em Deployer.Raspberry.Console.Program.d1.MoveNext() na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Console\\Program.cs:linha 49\n--- Fim do rastreamento de pilha do 
local anterior onde a exce\u00e7\u00e3o foi gerada ---\n em System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\n em System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\n em Deployer.Raspberry.Console.Program.
d0.MoveNext() na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Console\\Program.cs:linha 36\n --- Fim do rastreamento de pilha de exce\u00e7\u00f5es internas ---\n em System.Threading.Tasks.Task.ThrowIfExceptional(Boolean includeTaskCanceledExceptions)\n em System.Threading.Tasks.Task.Wait(Int32 millisecondsTimeout, CancellationToken cancellationToken)\n em Deployer.Raspberry.Gui.App.<>cDisplayClass20.b0() na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Gui\\App.xaml.cs:linha 40\n em Deployer.Gui.Common.ConsoleEmbedder.ExecuteInsideConsole(Action consoleAction) na D:\\a\\1\\s\\Source\\DeployerPlatform\\Deployer.Gui.Common\\ConsoleEmbedder.cs:linha 33\n em Deployer.Raspberry.Gui.App.LaunchConsole(String[] args) na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Gui\\App.xaml.cs:linha 40\n em Deployer.Raspberry.Gui.App.OnStartup(StartupEventArgs e) na D:\\a\\1\\s\\Source\\Deployer.Raspberry.Gui\\App.xaml.cs:linha 21\n em System.Windows.Application.<.ctor>b10(Object unused)\n em System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Int32 numArgs)\n em System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source, Delegate callback, Object args, Int32 numArgs, Delegate catchHandler)\n em System.Windows.Threading.DispatcherOperation.InvokeImpl()\n em System.Windows.Threading.DispatcherOperation.InvokeInSecurityContext(Object state)\n em MS.Internal.CulturePreservingExecutionContext.CallbackWrapper(Object obj)\n em System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)\n em System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)\n em System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)\n em MS.Internal.CulturePreservingExecutionContext.Run(CulturePreservingExecutionContext executionContext, ContextCallback callback, 
Object state)\n em System.Windows.Threading.DispatcherOperation.Invoke()\n em System.Windows.Threading.Dispatcher.ProcessQueue()\n em System.Windows.Threading.Dispatcher.WndProcHook(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)\n em MS.Win32.HwndWrapper.WndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)\n em MS.Win32.HwndSubclass.DispatcherCallbackOperation(Object o)\n em System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Int32 numArgs)\n em System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source, Delegate callback, Object args, Int32 numArgs, Delegate catchHandler)\n em System.Windows.Threading.Dispatcher.LegacyInvokeImpl(DispatcherPriority priority, TimeSpan timeout, Delegate method, Object args, Int32 numArgs)\n em MS.Win32.HwndSubclass.SubclassWndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam)\n em MS.Win32.UnsafeNativeMethods.DispatchMessage(MSG& msg)\n em System.Windows.Threading.Dispatcher.PushFrameImpl(DispatcherFrame frame)\n em System.Windows.Threading.Dispatcher.PushFrame(DispatcherFrame frame)\n em System.Windows.Application.RunDispatcher(Object ignore)\n em System.Windows.Application.RunInternal(Window window)\n em System.Windows.Application.Run(Window window)\n em Deployer.Raspberry.Gui.App.Main()\n```

\n","meta":{"source":"GitHub","url":"https://github.com/WOA-Project/WOA-Deployer-Rpi/issues/47"},"_input_hash":-65080771,"_task_hash":-1163994179,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Explain how to activate license somewhere","meta":{"source":"GitHub","url":"https://github.com/jakob/Postico/issues/404"},"label":"DOCUMENTATION","_input_hash":-316057038,"_task_hash":1707638224,"answer":"accept"} {"text":"Tests check that allowed use of `void` is an error","meta":{"source":"GitHub","url":"https://github.com/dart-lang/co19/issues/120"},"label":"DOCUMENTATION","_input_hash":-1075442477,"_task_hash":-1933390923,"answer":"reject"} {"text":"http.webdav doesn't work on armv7 ","meta":{"source":"GitHub","url":"https://github.com/hacdias/caddy-webdav/issues/1"},"label":"DOCUMENTATION","_input_hash":-139921243,"_task_hash":566778584,"answer":"reject"} {"text":"Add HTML legend support","meta":{"source":"GitHub","url":"https://github.com/dc-js/dc.js/issues/1325"},"label":"DOCUMENTATION","_input_hash":437657139,"_task_hash":-1647141299,"answer":"reject"} {"text":"# [build] Improve N0110 build/flash mechanism\n\n#### Describe the bug\r\nThere is no obvious and easy way to build and flash the firmware with just basic tooling (i.e. 
dfu-util and a USB cable, not a ST-Link).\r\n\r\n#### To Reproduce\r\nThis is what I had to type to flash a custom firmware for the N0110 through NumWorks's DFU:\r\n```\r\nmake EPSILON_DEVICE_BENCH=0 EPSILON_USB_DFU_XIP=0 EPSILON_ONBOARDING_APP=1 EPSILON_BOOT_PROMPT=update build/device/n0110/epsilon_two_binaries -j\r\nsudo dfu-util -D build/device/n0110/epsilon.internal.bin -s 0x08000000\r\nsudo dfu-util -D build/device/n0110/epsilon.external.bin -s 0x90000000\r\n```\r\n\r\n#### Expected behavior\r\n- [ ] Flashing a new firmware for the N0100 through dfu-util should be a short one-liner.\r\n- [ ] Official documentation should be updated (https://www.numworks.com/resources/engineering/software/sdk/).\r\n\r\n#### While I'm here\r\nI haven't looked too deeply into that yet, but I'm not sure there is a way to reflash from scratch (STM's DFU) both internal and external Flash with unmodified sources and dfu-util. The stock dfu-util utility refuses to flash at a location that is not described in the DFU string descriptor, so it won't let me download/upload the recovery to RAM on both STM's DFU and NumWorks's DFU.","title":"[build] Improve N0110 build/flash mechanism","body":"#### Describe the bug\r\nThere is no obvious and easy way to build and flash the firmware with just basic tooling (i.e. 
dfu-util and a USB cable, not a ST-Link).\r\n\r\n#### To Reproduce\r\nThis is what I had to type to flash a custom firmware for the N0110 through NumWorks's DFU:\r\n```\r\nmake EPSILON_DEVICE_BENCH=0 EPSILON_USB_DFU_XIP=0 EPSILON_ONBOARDING_APP=1 EPSILON_BOOT_PROMPT=update build/device/n0110/epsilon_two_binaries -j\r\nsudo dfu-util -D build/device/n0110/epsilon.internal.bin -s 0x08000000\r\nsudo dfu-util -D build/device/n0110/epsilon.external.bin -s 0x90000000\r\n```\r\n\r\n#### Expected behavior\r\n- [ ] Flashing a new firmware for the N0100 through dfu-util should be a short one-liner.\r\n- [ ] Official documentation should be updated (https://www.numworks.com/resources/engineering/software/sdk/).\r\n\r\n#### While I'm here\r\nI haven't looked too deeply into that yet, but I'm not sure there is a way to reflash from scratch (STM's DFU) both internal and external Flash with unmodified sources and dfu-util. The stock dfu-util utility refuses to flash at a location that is not described in the DFU string descriptor, so it won't let me download/upload the recovery to RAM on both STM's DFU and NumWorks's DFU.","html":"

[build] Improve N0110 build/flash mechanism

\n\n

Describe the bug

\n\n

There is no obvious and easy way to build and flash the firmware with just basic tooling (i.e. dfu-util and a USB cable, not a ST-Link).

\n\n

To Reproduce

\n\n

This is what I had to type to flash a custom firmware for the N0110 through NumWorks's DFU:\n\nmake EPSILON_DEVICE_BENCH=0 EPSILON_USB_DFU_XIP=0 EPSILON_ONBOARDING_APP=1 EPSILON_BOOT_PROMPT=update build/device/n0110/epsilon_two_binaries -j\nsudo dfu-util -D build/device/n0110/epsilon.internal.bin -s 0x08000000\nsudo dfu-util -D build/device/n0110/epsilon.external.bin -s 0x90000000\n

\n\n

Expected behavior

\n\n
    \n
  • [ ] Flashing a new firmware for the N0100 through dfu-util should be a short one-liner.
  • \n
  • [ ] Official documentation should be updated (https://www.numworks.com/resources/engineering/software/sdk/).
  • \n
\n\n

While I'm here

\n\n

I haven't looked too deeply into that yet, but I'm not sure there is a way to reflash from scratch (STM's DFU) both internal and external Flash with unmodified sources and dfu-util. The stock dfu-util utility refuses to flash at a location that is not described in the DFU string descriptor, so it won't let me download/upload the recovery to RAM on both STM's DFU and NumWorks's DFU.

\n","meta":{"source":"GitHub","url":"https://github.com/numworks/epsilon/issues/1052"},"_input_hash":-1406259303,"_task_hash":525729160,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Migrate from imp to importlib","meta":{"source":"GitHub","url":"https://github.com/micheles/plac/issues/24"},"label":"DOCUMENTATION","_input_hash":-1176530595,"_task_hash":303362642,"answer":"reject"} {"text":"End-user bug report","meta":{"source":"GitHub","url":"https://github.com/andybotting/xbmc-addon-abc-iview/issues/3032"},"label":"DOCUMENTATION","_input_hash":-243505783,"_task_hash":-1790827436,"answer":"reject"} {"text":"xmllint error when building docs","meta":{"source":"GitHub","url":"https://github.com/akrennmair/newsbeuter/issues/586"},"label":"DOCUMENTATION","_input_hash":-1784293327,"_task_hash":1983255132,"answer":"reject"} {"text":"# signingSecret overrides custom receiver\n\n### Description\r\n\r\nHi! I'm new to using Bolt and Slack in general.\r\nBeen scratching my head over why my custom receiver was not working, and when I looked at the source code in `src/App.tsx` I noticed this:\r\n\r\n```\r\nif (signingSecret !== undefined) {\r\n this.receiver = new ExpressReceiver({ signingSecret, endpoints });\t this.receiver = new ExpressReceiver({ signingSecret, logger, endpoints });\r\n } else if (receiver === undefined) {\t } \r\n```\r\n\r\nSo I commented out the SigningSecret that I was previously passing and it worked!\r\n\r\nI think the application should either not overwrite the passed receiver, or log a warning in this scenario.\r\n\r\n### What type of issue is this? 
(place an `x` in one of the `[ ]`)\r\n- [X] bug\r\n- [ ] enhancement (feature request)\r\n- [ ] question\r\n- [X] documentation related\r\n- [ ] testing related\r\n- [ ] discussion\r\n\r\n### Requirements (place an `x` in each of the `[ ]`)\r\n* [X] I've read and understood the [Contributing guidelines](https://github.com/slackapi/bolt/blob/master/.github/contributing.md) and have done my best effort to follow them.\r\n* [X] I've read and agree to the [Code of Conduct](https://slackhq.github.io/code-of-conduct).\r\n* [X] I've searched for any related issues and avoided creating a duplicate issue.\r\n\r\n---\r\n\r\n### Bug Report\r\n\r\nFilling out the following details about bugs will help us solve your issue sooner.\r\n\r\n#### Reproducible in:\r\n\r\npackage version: 1.2.0\r\n\r\nnode version: 12.6.0\r\n\r\nOS version(s): Windows 10 Pro\r\n\r\n#### Steps to reproduce:\r\n\r\n1. Create a custom receiver class\r\n2. Instantiate an App with a custom receiver and signing key\r\n3. Emit an event with the receiver\r\n\r\n#### Expected result:\r\n\r\nBolt doesn't overwrite my receiver with the ExpressReceiver\r\n\r\n#### Actual result:\r\n\r\nBolt overwrites my receiver with the ExpressReceiver\r\n\r\n#### Attachments:\r\n\r\nhttps://github.com/slackapi/bolt/blob/522e70b381cf3d18a88b7ca271dcfb4f0ce1be9b/src/App.ts#L161","title":"signingSecret overrides custom receiver","body":"### Description\r\n\r\nHi! 
I'm new to using Bolt and Slack in general.\r\nBeen scratching my head over why my custom receiver was not working, and when I looked at the source code in `src/App.tsx` I noticed this:\r\n\r\n```\r\nif (signingSecret !== undefined) {\r\n this.receiver = new ExpressReceiver({ signingSecret, endpoints });\t this.receiver = new ExpressReceiver({ signingSecret, logger, endpoints });\r\n } else if (receiver === undefined) {\t } \r\n```\r\n\r\nSo I commented out the SigningSecret that I was previously passing and it worked!\r\n\r\nI think the application should either not overwrite the passed receiver, or log a warning in this scenario.\r\n\r\n### What type of issue is this? (place an `x` in one of the `[ ]`)\r\n- [X] bug\r\n- [ ] enhancement (feature request)\r\n- [ ] question\r\n- [X] documentation related\r\n- [ ] testing related\r\n- [ ] discussion\r\n\r\n### Requirements (place an `x` in each of the `[ ]`)\r\n* [X] I've read and understood the [Contributing guidelines](https://github.com/slackapi/bolt/blob/master/.github/contributing.md) and have done my best effort to follow them.\r\n* [X] I've read and agree to the [Code of Conduct](https://slackhq.github.io/code-of-conduct).\r\n* [X] I've searched for any related issues and avoided creating a duplicate issue.\r\n\r\n---\r\n\r\n### Bug Report\r\n\r\nFilling out the following details about bugs will help us solve your issue sooner.\r\n\r\n#### Reproducible in:\r\n\r\npackage version: 1.2.0\r\n\r\nnode version: 12.6.0\r\n\r\nOS version(s): Windows 10 Pro\r\n\r\n#### Steps to reproduce:\r\n\r\n1. Create a custom receiver class\r\n2. Instantiate an App with a custom receiver and signing key\r\n3. 
Emit an event with the receiver\r\n\r\n#### Expected result:\r\n\r\nBolt doesn't overwrite my receiver with the ExpressReceiver\r\n\r\n#### Actual result:\r\n\r\nBolt overwrites my receiver with the ExpressReceiver\r\n\r\n#### Attachments:\r\n\r\nhttps://github.com/slackapi/bolt/blob/522e70b381cf3d18a88b7ca271dcfb4f0ce1be9b/src/App.ts#L161","html":"

signingSecret overrides custom receiver

\n\n

Description

\n\n

Hi! I'm new to using Bolt and Slack in general.\nBeen scratching my head over why my custom receiver was not working, and when I looked at the source code in src/App.tsx I noticed this:

\n\n

\nif (signingSecret !== undefined) {\n this.receiver = new ExpressReceiver({ signingSecret, endpoints }); this.receiver = new ExpressReceiver({ signingSecret, logger, endpoints });\n } else if (receiver === undefined) { } \n

\n\n

So I commented out the SigningSecret that I was previously passing and it worked!

\n\n

I think the application should either not overwrite the passed receiver, or log a warning in this scenario.

\n\n

What type of issue is this? (place an x in one of the [ ])

\n\n
    \n
  • [X] bug
  • \n
  • [ ] enhancement (feature request)
  • \n
  • [ ] question
  • \n
  • [X] documentation related
  • \n
  • [ ] testing related
  • \n
  • [ ] discussion
  • \n
\n\n

Requirements (place an x in each of the [ ])

\n\n
    \n
  • [X] I've read and understood the Contributing guidelines and have done my best effort to follow them.
  • \n
  • [X] I've read and agree to the Code of Conduct.
  • \n
  • [X] I've searched for any related issues and avoided creating a duplicate issue.
  • \n
\n\n
\n\n

Bug Report

\n\n

Filling out the following details about bugs will help us solve your issue sooner.

\n\n

Reproducible in:

\n\n

package version: 1.2.0

\n\n

node version: 12.6.0

\n\n

OS version(s): Windows 10 Pro

\n\n

Steps to reproduce:

\n\n
    \n
  1. Create a custom receiver class
  2. \n
  3. Instantiate an App with a custom receiver and signing key
  4. \n
  5. Emit an event with the receiver
  6. \n
\n\n

Expected result:

\n\n

Bolt doesn't overwrite my receiver with the ExpressReceiver

\n\n

Actual result:

\n\n

Bolt overwrites my receiver with the ExpressReceiver

\n\n

Attachments:

\n\n

https://github.com/slackapi/bolt/blob/522e70b381cf3d18a88b7ca271dcfb4f0ce1be9b/src/App.ts#L161

\n","meta":{"source":"GitHub","url":"https://github.com/slackapi/bolt/issues/235"},"_input_hash":-441576874,"_task_hash":1030560232,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Id generation ","meta":{"source":"GitHub","url":"https://github.com/mkaminaga/system_library/issues/14"},"label":"DOCUMENTATION","_input_hash":2097932431,"_task_hash":679829990,"answer":"reject"} {"text":"readme images","meta":{"source":"GitHub","url":"https://github.com/facn2/Alpha.HAML/issues/82"},"label":"DOCUMENTATION","_input_hash":-1767038703,"_task_hash":-938541124,"answer":"accept"} {"text":"Outdated README?","meta":{"source":"GitHub","url":"https://github.com/BurntSushi/quickcheck/issues/178"},"label":"DOCUMENTATION","_input_hash":-1953366822,"_task_hash":404282943,"answer":"accept"} {"text":"Update home page with developer tool info","meta":{"source":"GitHub","url":"https://github.com/aaronpk/IndieAuth.com/issues/165"},"label":"DOCUMENTATION","_input_hash":474552291,"_task_hash":2039949720,"answer":"accept"} {"text":"Add license badge to readme","meta":{"source":"GitHub","url":"https://github.com/sa7mon/press/issues/1"},"label":"DOCUMENTATION","_input_hash":-845841011,"_task_hash":-1204159237,"answer":"accept"} {"text":"could you share how to install is plugin?","meta":{"source":"GitHub","url":"https://github.com/joaoasrosa/nppxmltreeview/issues/21"},"label":"DOCUMENTATION","_input_hash":-1387501328,"_task_hash":1468435517,"answer":"accept"} {"text":"Language starter specs","meta":{"source":"GitHub","url":"https://github.com/zach-king/Kraken/issues/9"},"label":"DOCUMENTATION","_input_hash":1487707965,"_task_hash":1044208943,"answer":"accept"} {"text":"Documentation or example of creating an object with a Generic Relationship?","meta":{"source":"GitHub","url":"https://github.com/django-json-api/django-rest-framework-json-api/issues/370"},"label":"DOCUMENTATION","_input_hash":1018102888,"_task_hash":-1842976589,"answer":"accept"} 
{"text":"License?","meta":{"source":"GitHub","url":"https://github.com/ruippeixotog/think-bayes-scala/issues/8"},"label":"DOCUMENTATION","_input_hash":-524685639,"_task_hash":-171005256,"answer":"accept"} {"text":"Setting up Heroku Server documentation","meta":{"source":"GitHub","url":"https://github.com/dmhacker/dmhacker-youtube/issues/1"},"label":"DOCUMENTATION","_input_hash":1070157828,"_task_hash":-63440854,"answer":"accept"} {"text":"Update documentation","meta":{"source":"GitHub","url":"https://github.com/EngSoc-IT-Team/QTap/issues/99"},"label":"DOCUMENTATION","_input_hash":1961376554,"_task_hash":914666225,"answer":"accept"} {"text":"# SqlSetup: Localized string TestFailedAfterSet should reference link to check bootstrap logs\n\nText should be added to tell to look for reported errors from the setup.exe in the logs. Reference the article https://docs.microsoft.com/en-us/sql/database-engine/install-windows/view-and-read-sql-server-setup-log-files for help.\r\n\r\nhttps://github.com/PowerShell/SqlServerDsc/blob/2337c7cbaa9c47d5bf82802ec77167922ec892b6/DSCResources/MSFT_SqlSetup/en-US/MSFT_SqlSetup.strings.psd1#L58\r\n","title":"SqlSetup: Localized string TestFailedAfterSet should reference link to check bootstrap logs","body":"Text should be added to tell to look for reported errors from the setup.exe in the logs. Reference the article https://docs.microsoft.com/en-us/sql/database-engine/install-windows/view-and-read-sql-server-setup-log-files for help.\r\n\r\nhttps://github.com/PowerShell/SqlServerDsc/blob/2337c7cbaa9c47d5bf82802ec77167922ec892b6/DSCResources/MSFT_SqlSetup/en-US/MSFT_SqlSetup.strings.psd1#L58\r\n","html":"

SqlSetup: Localized string TestFailedAfterSet should reference link to check bootstrap logs

\n\n

Text should be added to tell to look for reported errors from the setup.exe in the logs. Reference the article https://docs.microsoft.com/en-us/sql/database-engine/install-windows/view-and-read-sql-server-setup-log-files for help.

\n\n

https://github.com/PowerShell/SqlServerDsc/blob/2337c7cbaa9c47d5bf82802ec77167922ec892b6/DSCResources/MSFTSqlSetup/en-US/MSFTSqlSetup.strings.psd1#L58

\n","meta":{"source":"GitHub","url":"https://github.com/PowerShell/SqlServerDsc/issues/1420"},"_input_hash":-683031885,"_task_hash":-268321631,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# absda\n\n# Azure documentation issue guidance\r\n\r\nThanks for opening an issue in the Azure technical documentation repository. \r\n\r\nWe use GitHub issues as the primary channel for customer and community feedback about the Azure documentation.\r\n\r\n## Creating an issue\r\n\r\nWe prefer that you create documentation feedback issues using the Feedback link on the published article - the feedback control on the doc page creates an issue that contains all the article details so you can focus on the feedback part.\r\n\r\nYou can also create a feedback issue here in the repo. If you do this, please make sure your issue lists:\r\n\r\n- [ ] The relevant Azure service or technology. \r\n- [ ] A link to the published documentation article that you have feedback about.\r\n- [ ] Clear, specific feedback that the author can act on.\r\n\r\n## Pull requests and article contributions\r\n\r\nIf you know the change that is needed in an article, we encourage you to submit the changes directly using a pull request. If the change is large, or if you want to contribute an entire article, follow these guidelines:\r\n\r\n- [ ] Don't surprise us with a big pull request or a pull request with a new article! Submit an issue that describes the details of the proposed large change or new article. 
\r\n- [ ] Include the service or technology area.\r\n\r\nWe'll route the issue to the appropriate content team for review and discussion.\r\n\r\n## Tech support and product feedback\r\nIf you would like to contact Microsoft about other things, such as product feedback or tech support, please review these guidelines:\r\n\r\n- If you need technical support using Azure, the paid and free support options are described here: https://azure.microsoft.com/support/options/.\r\n\r\n- Each article in the Azure technical documentation contains a product feedback button - it's best to submit product feedback directly from a relevant article. Otherwise, you can submit product feedback for most Azure products in the following product feedback forum: https://feedback.azure.com/forums/34192--general-feedback.\r\n","title":"absda","body":"# Azure documentation issue guidance\r\n\r\nThanks for opening an issue in the Azure technical documentation repository. \r\n\r\nWe use GitHub issues as the primary channel for customer and community feedback about the Azure documentation.\r\n\r\n## Creating an issue\r\n\r\nWe prefer that you create documentation feedback issues using the Feedback link on the published article - the feedback control on the doc page creates an issue that contains all the article details so you can focus on the feedback part.\r\n\r\nYou can also create a feedback issue here in the repo. If you do this, please make sure your issue lists:\r\n\r\n- [ ] The relevant Azure service or technology. \r\n- [ ] A link to the published documentation article that you have feedback about.\r\n- [ ] Clear, specific feedback that the author can act on.\r\n\r\n## Pull requests and article contributions\r\n\r\nIf you know the change that is needed in an article, we encourage you to submit the changes directly using a pull request. 
If the change is large, or if you want to contribute an entire article, follow these guidelines:\r\n\r\n- [ ] Don't surprise us with a big pull request or a pull request with a new article! Submit an issue that describes the details of the proposed large change or new article. \r\n- [ ] Include the service or technology area.\r\n\r\nWe'll route the issue to the appropriate content team for review and discussion.\r\n\r\n## Tech support and product feedback\r\nIf you would like to contact Microsoft about other things, such as product feedback or tech support, please review these guidelines:\r\n\r\n- If you need technical support using Azure, the paid and free support options are described here: https://azure.microsoft.com/support/options/.\r\n\r\n- Each article in the Azure technical documentation contains a product feedback button - it's best to submit product feedback directly from a relevant article. Otherwise, you can submit product feedback for most Azure products in the following product feedback forum: https://feedback.azure.com/forums/34192--general-feedback.\r\n","html":"

absda

\n\n

Azure documentation issue guidance

\n\n

Thanks for opening an issue in the Azure technical documentation repository.

\n\n

We use GitHub issues as the primary channel for customer and community feedback about the Azure documentation.

\n\n

Creating an issue

\n\n

We prefer that you create documentation feedback issues using the Feedback link on the published article - the feedback control on the doc page creates an issue that contains all the article details so you can focus on the feedback part.

\n\n

You can also create a feedback issue here in the repo. If you do this, please make sure your issue lists:

\n\n
    \n
  • [ ] The relevant Azure service or technology.
  • \n
  • [ ] A link to the published documentation article that you have feedback about.
  • \n
  • [ ] Clear, specific feedback that the author can act on.
  • \n
\n\n

Pull requests and article contributions

\n\n

If you know the change that is needed in an article, we encourage you to submit the changes directly using a pull request. If the change is large, or if you want to contribute an entire article, follow these guidelines:

\n\n
    \n
  • [ ] Don't surprise us with a big pull request or a pull request with a new article! Submit an issue that describes the details of the proposed large change or new article.
  • \n
  • [ ] Include the service or technology area.
  • \n
\n\n

We'll route the issue to the appropriate content team for review and discussion.

\n\n

Tech support and product feedback

\n\n

If you would like to contact Microsoft about other things, such as product feedback or tech support, please review these guidelines:

\n\n
    \n
  • If you need technical support using Azure, the paid and free support options are described here: https://azure.microsoft.com/support/options/.

  • \n
  • Each article in the Azure technical documentation contains a product feedback button - it's best to submit product feedback directly from a relevant article. Otherwise, you can submit product feedback for most Azure products in the following product feedback forum: https://feedback.azure.com/forums/34192--general-feedback.

  • \n
\n","meta":{"source":"GitHub","url":"https://github.com/MicrosoftDocs/azure-docs/issues/36864"},"_input_hash":-1484997050,"_task_hash":215132115,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# Does esdoc-typescript-plugin allow me to hand write JSDoc/TSDoc without types getting in the way?\n\nI have \"mixin classes\" (class-factory mixins), but I just want to document them as regular classes, and use things like `@mixes` (or similar) to describe what they inherit from. I don't care about having to duplicate types in my comments, if it means I can have control and can shape the documentation output by hand.\r\n\r\nThe reason is, the end use case of my (mixin) classes is for them to be custom elements. The mixin functionality is only for combining classes together during implementation mostly (though a discerning intellisense user will be able to pick up on and use the mixin patterns).\r\n\r\nBasically, I have classes that can be used like this:\r\n\r\n```ts\r\nimport {Transformable} from './Transformable'\r\n\r\n// instantiate one:\r\nconst f = new Transformable\r\n\r\n// extend it like a regular class\r\nclass Foo extends Transformable {}\r\n\r\n// or mix it with other classes:\r\nclass Bar {}\r\nclass Baz extends Transformable.mixin(Bar) {}\r\n```\r\n\r\nWhere I want to document the Transformable class something like the following:\r\n\r\n```ts\r\nimport {TreeNode} from './TreeNode'\r\nimport {Sizeable} from './Sizeable'\r\nimport {Constructor, Mixin, MixinResult} from 'lowclass'\r\n\r\n/**\r\n * @class Transformable\r\n * @mixin\r\n * @mixes TreeNode\r\n * @mixes Sizeable\r\n */\r\nfunction TransformableMixin(Base: T) {\r\n\r\n // The Transformable mixin class is composed from TreeNode and Sizeable mixin classes\r\n const Parent = TreeNode.mixin(Sizeable.mixin(Constructor(Base)))\r\n\r\n class Transformable extends Parent {\r\n /**\r\n * Set the position of the Transformable.\r\n *\r\n * @property position\r\n * @memberof 
Transformable\r\n * @type {SomeType}\r\n */\r\n set position(newValue: any) {\r\n this._setPropertyXYZ('position', newValue)\r\n }\r\n get position(): any {\r\n return this._props.position\r\n }\r\n\r\n // ... etc ...\r\n }\r\n\r\n return Transformable as MixinResult\r\n}\r\n\r\n// this actually creates the class reference.\r\nexport const Transformable = Mixin(TransformableMixin)\r\nexport interface Transformable extends InstanceType {}\r\n```\r\n\r\nSee what I'm trying to do there?\r\n\r\nBasically, I'd like to use `@mixes` (or something) for multiple inheritance. I'd like to be able to represent this in the docs somehow (f.e. like one class with multiple arrows pointing to the other classes, or something).\r\n\r\nIn the end, a user will only use the class instances directly, and won't necessarily even need to know about the mixin functionality:\r\n\r\n```js\r\n// `mesh` inherits from Transformable, and possibly from something else.\r\nconst mesh = document.querySelector('box-mesh')\r\n\r\n// but in the end, the user reading docs just needs to know about the classes, and their inherited properties.\r\n// Under the hood the instances are composed from mixin classes, but that's not important here, and things like\r\n// TypeDoc try to document every aspect possible, including mixin machinery.\r\n\r\n// The user just needs to do this, for example:\r\nmesh.position = {y: 20}\r\n```\r\n\r\nSo I'm aiming to make the docs really simple. I really don't want to throw an HTML beginner at some TypeDoc docs (I hope you know what I mean).\r\n\r\nSeems like what I need is for some parser to parse JSDoc comments out of my TypeScript files, then I should handle the rest myself? 
I've had no luck with that so far.\r\n\r\nAny ideas?","title":"Does esdoc-typescript-plugin allow me to hand write JSDoc/TSDoc without types getting in the way?","body":"I have \"mixin classes\" (class-factory mixins), but I just want to document them as regular classes, and use things like `@mixes` (or similar) to describe what they inherit from. I don't care about having to duplicate types in my comments, if it means I can have control and can shape the documentation output by hand.\r\n\r\nThe reason is, the end use case of my (mixin) classes is for them to be custom elements. The mixin functionality is only for combining classes together during implementation mostly (though a discerning intellisense user will be able to pick up on and use the mixin patterns).\r\n\r\nBasically, I have classes that can be used like this:\r\n\r\n```ts\r\nimport {Transformable} from './Transformable'\r\n\r\n// instantiate one:\r\nconst f = new Transformable\r\n\r\n// extend it like a regular class\r\nclass Foo extends Transformable {}\r\n\r\n// or mix it with other classes:\r\nclass Bar {}\r\nclass Baz extends Transformable.mixin(Bar) {}\r\n```\r\n\r\nWhere I want to document the Transformable class something like the following:\r\n\r\n```ts\r\nimport {TreeNode} from './TreeNode'\r\nimport {Sizeable} from './Sizeable'\r\nimport {Constructor, Mixin, MixinResult} from 'lowclass'\r\n\r\n/**\r\n * @class Transformable\r\n * @mixin\r\n * @mixes TreeNode\r\n * @mixes Sizeable\r\n */\r\nfunction TransformableMixin(Base: T) {\r\n\r\n // The Transformable mixin class is composed from TreeNode and Sizeable mixin classes\r\n const Parent = TreeNode.mixin(Sizeable.mixin(Constructor(Base)))\r\n\r\n class Transformable extends Parent {\r\n /**\r\n * Set the position of the Transformable.\r\n *\r\n * @property position\r\n * @memberof Transformable\r\n * @type {SomeType}\r\n */\r\n set position(newValue: any) {\r\n this._setPropertyXYZ('position', newValue)\r\n }\r\n get position(): any {\r\n 
return this._props.position\r\n }\r\n\r\n // ... etc ...\r\n }\r\n\r\n return Transformable as MixinResult\r\n}\r\n\r\n// this actually creates the class reference.\r\nexport const Transformable = Mixin(TransformableMixin)\r\nexport interface Transformable extends InstanceType {}\r\n```\r\n\r\nSee what I'm trying to do there?\r\n\r\nBasically, I'd like to use `@mixes` (or something) for multiple inheritance. I'd like to be able to represent this in the docs somehow (f.e. like one class with multiple arrows pointing to the other classes, or something).\r\n\r\nIn the end, a user will only use the class instances directly, and won't necessarily even need to know about the mixin functionality:\r\n\r\n```js\r\n// `mesh` inherits from Transformable, and possibly from something else.\r\nconst mesh = document.querySelector('box-mesh')\r\n\r\n// but in the end, the user reading docs just needs to know about the classes, and their inherited properties.\r\n// Under the hood the instances are composed from mixin classes, but that's not important here, and things like\r\n// TypeDoc try to document every aspect possible, including mixin machinery.\r\n\r\n// The user just needs to do this, for example:\r\nmesh.position = {y: 20}\r\n```\r\n\r\nSo I'm aiming to make the docs really simple. I really don't want to throw an HTML beginner at some TypeDoc docs (I hope you know what I mean).\r\n\r\nSeems like what I need is for some parser to parse JSDoc comments out of my TypeScript files, then I should handle the rest myself? I've had no luck with that so far.\r\n\r\nAny ideas?","html":"

Does esdoc-typescript-plugin allow me to hand write JSDoc/TSDoc without types getting in the way?

\n\n

I have \"mixin classes\" (class-factory mixins), but I just want to document them as regular classes, and use things like @mixes (or similar) to describe what they inherit from. I don't care about having to duplicate types in my comments, if it means I can have control and can shape the documentation output by hand.

\n\n

The reason is, the end use case of my (mixin) classes is for them to be custom elements. The mixin functionality is only for combining classes together during implementation mostly (though a discerning intellisense user will be able to pick up on and use the mixin patterns).

\n\n

Basically, I have classes that can be used like this:

\n\n

```ts\nimport {Transformable} from './Transformable'

\n\n

// instantiate one:\nconst f = new Transformable

\n\n

// extend it like a regular class\nclass Foo extends Transformable {}

\n\n

// or mix it with other classes:\nclass Bar {}\nclass Baz extends Transformable.mixin(Bar) {}\n```

\n\n

Where I want to document the Transformable class something like the following:

\n\n

```ts\nimport {TreeNode} from './TreeNode'\nimport {Sizeable} from './Sizeable'\nimport {Constructor, Mixin, MixinResult} from 'lowclass'

\n\n

/**\n * @class Transformable\n * @mixin\n * @mixes TreeNode\n * @mixes Sizeable\n */\nfunction TransformableMixin(Base: T) {

\n\n
// The Transformable mixin class is composed from TreeNode and Sizeable mixin classes\nconst Parent = TreeNode.mixin(Sizeable.mixin(Constructor(Base)))\n\nclass Transformable extends Parent {\n    /**\n     * Set the position of the Transformable.\n     *\n     * @property position\n     * @memberof Transformable\n     * @type {SomeType}\n     */\n    set position(newValue: any) {\n        this._setPropertyXYZ<Transformable, TransformProp>('position', newValue)\n    }\n    get position(): any {\n        return this._props.position\n    }\n\n    // ... etc ...\n}\n\nreturn Transformable as MixinResult<typeof Transformable, T>\n
\n\n

}

\n\n

// this actually creates the class reference.\nexport const Transformable = Mixin(TransformableMixin)\nexport interface Transformable extends InstanceType {}\n```

\n\n

See what I'm trying to do there?

\n\n

Basically, I'd like to use @mixes (or something) for multiple inheritance. I'd like to be able to represent this in the docs somehow (f.e. like one class with multiple arrows pointing to the other classes, or something).

\n\n

In the end, a user will only use the class instances directly, and won't necessarily even need to know about the mixin functionality:

\n\n

```js\n// `mesh` inherits from Transformable, and possibly from something else.\nconst mesh = document.querySelector('box-mesh')

\n\n

// but in the end, the user reading docs just needs to know about the classes, and their inherited properties.\n// Under the hood the instances are composed from mixin classes, but that's not important here, and things like\n// TypeDoc try to document every aspect possible, including mixin machinery.

\n\n

// The user just needs to do this, for example:\nmesh.position = {y: 20}\n```

\n\n

So I'm aiming to make the docs really simple. I really don't want to throw an HTML beginner at some TypeDoc docs (I hope you know what I mean).

\n\n

Seems like what I need is for some parser to parse JSDoc comments out of my TypeScript files, then I should handle the rest myself? I've had no luck with that so far.

\n\n

Any ideas?

\n","meta":{"source":"GitHub","url":"https://github.com/esdoc/esdoc-plugins/issues/90"},"_input_hash":1058571255,"_task_hash":-1779658288,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"There is a bug when you click the number in the center of the dialog.","meta":{"source":"GitHub","url":"https://github.com/ch-muhammad-adil/Android-Material-Picker-Dialog/issues/1"},"label":"DOCUMENTATION","_input_hash":396124217,"_task_hash":1934669342,"answer":"reject"} {"text":"Dashes in headings","meta":{"source":"GitHub","url":"https://github.com/code-warrior/web-design-at-risd--summer-2017--assignment-1/issues/1"},"label":"DOCUMENTATION","_input_hash":1516508491,"_task_hash":768464026,"answer":"reject"} {"text":"# Having 404 page in different languages\n\nI see in gatsby-node.js there is this condition that avoids generating multilingual pages for the 404 page. \r\n\r\n```\r\n// Only create one 404 page at /404.html\r\n if (page.path.includes('404')) {\r\n return\r\n }\r\n```\r\n\r\nBut running `gatsby build` generates the following error. \r\n```\r\nBuilding static HTML failed for path \"/404/\"\r\n\r\nSee our docs page for more info on this error: https://gatsby.dev/debug-html\r\n\r\n\r\n 21 | \r\n 22 | \r\n> 23 | {i18n.schedule}\r\n | ^\r\n 24 | \r\n 25 | \r\n 26 |\r\n\r\n\r\n WebpackError: TypeError: Cannot read property 'schedule' of undefined\r\n```\r\n\r\nI guess this is happening because no `locale` is defined so that the `i18n` translations are not available. \r\n\r\nIs there any possibility to make the 404 pages translatable in every language? \r\n\r\nDoes anyone have a nice approach to solving this? ","title":"Having 404 page in different languages","body":"I see in gatsby-node.js there is this condition that avoids generating multilingual pages for the 404 page. \r\n\r\n```\r\n// Only create one 404 page at /404.html\r\n if (page.path.includes('404')) {\r\n return\r\n }\r\n```\r\n\r\nBut running `gatsby build` generates the following error. 
\r\n```\r\nBuilding static HTML failed for path \"/404/\"\r\n\r\nSee our docs page for more info on this error: https://gatsby.dev/debug-html\r\n\r\n\r\n 21 | \r\n 22 | \r\n> 23 | {i18n.schedule}\r\n | ^\r\n 24 | \r\n 25 | \r\n 26 |\r\n\r\n\r\n WebpackError: TypeError: Cannot read property 'schedule' of undefined\r\n```\r\n\r\nI guess this is happening because no `locale` is defined so that the `i18n` translations are not available. \r\n\r\nIs there any possibility to make the 404 pages translatable in every language? \r\n\r\nDoes anyone have a nice approach to solving this? ","html":"

Having 404 page in different languages

\n\n

I see in gatsby-node.js there is this condition that avoids generating multilingual pages for the 404 page.

\n\n

\n// Only create one 404 page at /404.html\n if (page.path.includes('404')) {\n return\n }\n

\n\n

But running gatsby build generates the following error. \n```\nBuilding static HTML failed for path \"/404/\"

\n\n

See our docs page for more info on this error: https://gatsby.dev/debug-html

\n\n

21 | \n 22 |

\n\n
\n

23 | {i18n.schedule}\n | ^\n 24 | \n 25 | \n 26 |

\n
\n\n

WebpackError: TypeError: Cannot read property 'schedule' of undefined\n```

\n\n

I guess this is happening because no locale is defined so that the i18n translations are not available.

\n\n

Is there any possibility to make the 404 pages translatable in every language?

\n\n

Does anyone have a nice approach to solving this?

\n","meta":{"source":"GitHub","url":"https://github.com/LekoArts/gatsby-starter-prismic-i18n/issues/77"},"_input_hash":1723541531,"_task_hash":-990445688,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"NITRO Compilation under Windos 7 and Visual Studio 2010","meta":{"source":"GitHub","url":"https://github.com/mdaus/nitro/issues/40"},"label":"DOCUMENTATION","_input_hash":1256718719,"_task_hash":1042456792,"answer":"reject"} {"text":"# Exclude expired assets from search results\n\n**Question?**\r\n* What piece of functionality do you have a question about?\r\n* What are you trying to achieve?\r\n\r\nDocumentation can be found here: https://adobe-marketing-cloud.github.io/asset-share-commons/. It might answer your question!\r\n\r\n\r\nHi ,\r\n\r\nI have same kind of requirement where i expired assets should not show up on landing page(home page) but when i will search them through property filter then expired assets should display with other assets.\r\nTo achieve this i have excluded expired assets from search restrictions but now the issue i am facing is that i am not even able to search them through search filter.\r\nHow should i achieve this scenario?\r\nWill hidden filter help me out in this cas\r\n\r\nThanks,\r\nLovepreet","title":"Exclude expired assets from search results","body":"**Question?**\r\n* What piece of functionality do you have a question about?\r\n* What are you trying to achieve?\r\n\r\nDocumentation can be found here: https://adobe-marketing-cloud.github.io/asset-share-commons/. 
It might answer your question!\r\n\r\n\r\nHi ,\r\n\r\nI have same kind of requirement where i expired assets should not show up on landing page(home page) but when i will search them through property filter then expired assets should display with other assets.\r\nTo achieve this i have excluded expired assets from search restrictions but now the issue i am facing is that i am not even able to search them through search filter.\r\nHow should i achieve this scenario?\r\nWill hidden filter help me out in this cas\r\n\r\nThanks,\r\nLovepreet","html":"

Exclude expired assets from search results

\n\n

Question?\n* What piece of functionality do you have a question about?\n* What are you trying to achieve?

\n\n

Documentation can be found here: https://adobe-marketing-cloud.github.io/asset-share-commons/. It might answer your question!

\n\n

Hi ,

\n\n

I have same kind of requirement where i expired assets should not show up on landing page(home page) but when i will search them through property filter then expired assets should display with other assets.\nTo achieve this i have excluded expired assets from search restrictions but now the issue i am facing is that i am not even able to search them through search filter.\nHow should i achieve this scenario?\nWill hidden filter help me out in this cas

\n\n

Thanks,\nLovepreet

\n","meta":{"source":"GitHub","url":"https://github.com/Adobe-Marketing-Cloud/asset-share-commons/issues/440"},"_input_hash":-1031948705,"_task_hash":-494417662,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Documentation missing for Selenium Html Runner","meta":{"source":"GitHub","url":"https://github.com/SeleniumHQ/selenium/issues/4379"},"label":"DOCUMENTATION","_input_hash":1189078933,"_task_hash":991125040,"answer":"accept"} {"text":"# Minecraft Version Field doesn't have any options.\n\n\r\n\r\n Solder Repo Hash:\r\n\r\n Operating System and version: Fedora 30 (server)\r\n\r\n PHP version: 7.3.7\r\n\r\n\r\n Composer version: 1.9.0\r\n\r\n\r\n Server type: \r\n\r\n Type of database: SQLite\r\n\r\n\r\n Type of hosting: Personal Shared\r\n\r\n\r\n Link to the affected install's public url: solder.binaryaura.net/\r\n\r\n repo.binaryaura.net/solder\r\n---------------------------------------------------------\r\n\r\n\r\nI'm not sure if I'm doing something in the wrong order, or I've put something in the wrong place, but the field 'Minecraft Version' the field below has no options to choose. I was give more information but, I'm getting no errors and I have no documentation to point me in the right direction. Any direction at all would be appreciated.\r\nThank-you.\r\n","title":"Minecraft Version Field doesn't have any options.","body":"\r\n\r\n Solder Repo Hash:\r\n\r\n Operating System and version: Fedora 30 (server)\r\n\r\n PHP version: 7.3.7\r\n\r\n\r\n Composer version: 1.9.0\r\n\r\n\r\n Server type: \r\n\r\n Type of database: SQLite\r\n\r\n\r\n Type of hosting: Personal Shared\r\n\r\n\r\n Link to the affected install's public url: solder.binaryaura.net/\r\n\r\n repo.binaryaura.net/solder\r\n---------------------------------------------------------\r\n\r\n\r\nI'm not sure if I'm doing something in the wrong order, or I've put something in the wrong place, but the field 'Minecraft Version' the field below has no options to choose. 
I was give more information but, I'm getting no errors and I have no documentation to point me in the right direction. Any direction at all would be appreciated.\r\nThank-you.\r\n","html":"

Minecraft Version Field doesn't have any options.

\n\n\n\n

Solder Repo Hash:

\n\n

Operating System and version: Fedora 30 (server)

\n\n

PHP version: 7.3.7

\n\n

\n Composer version: 1.9.0

\n\n

\n Server type:

\n\n

Type of database: SQLite

\n\n

\n Type of hosting: Personal Shared

\n\n

\n Link to the affected install's public url: solder.binaryaura.net/

\n\n

repo.binaryaura.net/solder

\n\n

\nI'm not sure if I'm doing something in the wrong order, or I've put something in the wrong place, but the field 'Minecraft Version' the field below has no options to choose. I was give more information but, I'm getting no errors and I have no documentation to point me in the right direction. Any direction at all would be appreciated.\nThank-you.

\n","meta":{"source":"GitHub","url":"https://github.com/TechnicPack/TechnicSolder/issues/676"},"_input_hash":-950183906,"_task_hash":903770437,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Problem connecting to Acquia DevDesktop local site","meta":{"source":"GitHub","url":"https://github.com/felixfbecker/vscode-php-debug/issues/181"},"label":"DOCUMENTATION","_input_hash":-480515938,"_task_hash":-367925525,"answer":"reject"} {"text":"python3 Question","meta":{"source":"GitHub","url":"https://github.com/lbl-srg/BuildingsPy/issues/169"},"label":"DOCUMENTATION","_input_hash":-246781622,"_task_hash":597331010,"answer":"reject"} {"text":"Allow rendering of choral page even if there is no data","meta":{"source":"GitHub","url":"https://github.com/ChoralCloud/ChoralWeb/issues/116"},"label":"DOCUMENTATION","_input_hash":1978807162,"_task_hash":-332509074,"answer":"reject"} {"text":"# Several tests fail on Windows with 0.4.0\n\nTest log:\r\n```\r\n============================= test session starts =============================\r\nplatform win32 -- Python 3.7.4, pytest-5.0.1, py-1.8.0, pluggy-0.12.0\r\nrootdir: C:\\w\\2\\s\\packaging\\windows\\vision\r\ncollected 187 items\r\n\r\ntest\\test_backbone_utils.py .. [ 1%]\r\ntest\\test_cpp_models.py FFFFF..FFFF........FFFFFFFFFF.. [ 17%]\r\ntest\\test_datasets.py ..F...... [ 22%]\r\ntest\\test_datasets_transforms.py .. [ 23%]\r\ntest\\test_datasets_utils.py .....FFF. [ 28%]\r\ntest\\test_datasets_video_utils.py ..FFss [ 31%]\r\ntest\\test_io.py .FFFFF [ 34%]\r\ntest\\test_models.py ................................................ [ 60%]\r\ntest\\test_ops.py ..s..s.s.s.s.s.s.s.s [ 71%]\r\ntest\\test_transforms.py ..........sss................................... [ 96%]\r\n.. 
[ 97%]\r\ntest\\test_utils.py ..FF [100%]\r\n\r\n================================== FAILURES ===================================\r\n_____________________________ Tester.test_alexnet _____________________________\r\n\r\nself = \r\n\r\n def test_alexnet(self):\r\n> process_model(models.alexnet(self.pretrained), self.image, _C_tests.forward_alexnet, 'Alexnet')\r\n\r\ntest\\test_cpp_models.py:43: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = AlexNet(\r\n (features): Sequential(\r\n (0): Conv2d(3, 64, kernel_size=(11, 11), stride=(4, 4), padding=(2, 2))\r\n (1)...ures=4096, bias=True)\r\n (5): ReLU(inplace=True)\r\n (6): Linear(in_features=4096, out_features=1000, bias=True)\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'Alexnet'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n___________________________ Tester.test_densenet121 ___________________________\r\n\r\nself = \r\n\r\n def test_densenet121(self):\r\n> process_model(models.densenet121(self.pretrained), self.image, _C_tests.forward_densenet121, 'Densenet121')\r\n\r\ntest\\test_cpp_models.py:105: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = DenseNet(\r\n (features): Sequential(\r\n (conv0): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 
2), padding=(3, 3), bias....1, affine=True, track_running_stats=True)\r\n )\r\n (classifier): Linear(in_features=1024, out_features=1000, bias=True)\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'Densenet121'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n___________________________ Tester.test_densenet161 ___________________________\r\n\r\nself = \r\n\r\n def test_densenet161(self):\r\n> process_model(models.densenet161(self.pretrained), self.image, _C_tests.forward_densenet161, 'Densenet161')\r\n\r\ntest\\test_cpp_models.py:114: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = DenseNet(\r\n (features): Sequential(\r\n (conv0): Conv2d(3, 96, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias....1, affine=True, track_running_stats=True)\r\n )\r\n (classifier): Linear(in_features=2208, out_features=1000, bias=True)\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'Densenet161'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n 
traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n___________________________ Tester.test_densenet169 ___________________________\r\n\r\nself = \r\n\r\n def test_densenet169(self):\r\n> process_model(models.densenet169(self.pretrained), self.image, _C_tests.forward_densenet169, 'Densenet169')\r\n\r\ntest\\test_cpp_models.py:108: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = DenseNet(\r\n (features): Sequential(\r\n (conv0): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias....1, affine=True, track_running_stats=True)\r\n )\r\n (classifier): Linear(in_features=1664, out_features=1000, bias=True)\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'Densenet169'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n___________________________ Tester.test_densenet201 ___________________________\r\n\r\nself = \r\n\r\n def test_densenet201(self):\r\n> process_model(models.densenet201(self.pretrained), self.image, 
_C_tests.forward_densenet201, 'Densenet201')\r\n\r\ntest\\test_cpp_models.py:111: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = DenseNet(\r\n (features): Sequential(\r\n (conv0): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias....1, affine=True, track_running_stats=True)\r\n )\r\n (classifier): Linear(in_features=1920, out_features=1000, bias=True)\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'Densenet201'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n___________________________ Tester.test_mnasnet0_5 ____________________________\r\n\r\nself = \r\n\r\n def test_mnasnet0_5(self):\r\n> process_model(models.mnasnet0_5(self.pretrained), self.image, _C_tests.forward_mnasnet0_5, 'MNASNet0_5')\r\n\r\ntest\\test_cpp_models.py:123: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = MNASNet(\r\n (layers): Sequential(\r\n (0): Conv2d(3, 32, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)...Sequential(\r\n (0): Dropout(p=0.2, inplace=True)\r\n (1): Linear(in_features=1280, out_features=1000, bias=True)\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 
0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'MNASNet0_5'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n___________________________ Tester.test_mnasnet0_75 ___________________________\r\n\r\nself = \r\n\r\n def test_mnasnet0_75(self):\r\n> process_model(models.mnasnet0_75(self.pretrained), self.image, _C_tests.forward_mnasnet0_75, 'MNASNet0_75')\r\n\r\ntest\\test_cpp_models.py:126: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = MNASNet(\r\n (layers): Sequential(\r\n (0): Conv2d(3, 32, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)...Sequential(\r\n (0): Dropout(p=0.2, inplace=True)\r\n (1): Linear(in_features=1280, out_features=1000, bias=True)\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'MNASNet0_75'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n___________________________ 
```
___________________________ Tester.test_mnasnet1_0 ____________________________

    def test_mnasnet1_0(self):
>       process_model(models.mnasnet1_0(self.pretrained), self.image, _C_tests.forward_mnasnet1_0, 'MNASNet1_0')

test\test_cpp_models.py:129:

    def process_model(model, tensor, func, name):
        model.eval()
        traced_script_module = torch.jit.trace(model, tensor)
        traced_script_module.save("model.pt")

        py_output = model.forward(tensor)
>       cpp_output = func("model.pt", tensor)
E       RuntimeError: undefined Tensor (infer_is_variable at C:\w\1\s\windows\pytorch\build\aten\src\ATen/Functions.h:1149)
E       (no backtrace available)

test\test_cpp_models.py:16: RuntimeError
```

The following tests fail at the same `cpp_output = func("model.pt", tensor)` call in `process_model`, with the identical `RuntimeError: undefined Tensor (infer_is_variable at ATen/Functions.h:1149)`: `test_mnasnet1_3`, `test_squeezenet1_0`, `test_squeezenet1_1`, `test_vgg11`, `test_vgg11_bn`, `test_vgg13`, `test_vgg13_bn`, `test_vgg16`, `test_vgg16_bn`, `test_vgg19`, and `test_vgg19_bn`. The reported header location varies between two trees: the PyTorch build directory (`C:\w\1\s\windows\pytorch\build\aten\src\ATen/Functions.h:1149`) and the installed package headers (`C:\w\2\s\packaging\windows\conda\envs\py37\lib\site-packages\torch\include\ATen/Functions.h:1149`).
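The failures above all occur inside a compiled C++ test extension, while tracing and saving succeed on the Python side. A minimal sketch for isolating the two sides, assuming only a tiny stand-in model (hypothetical, not torchvision's `process_model`): round-trip the traced module through `torch.jit.save`/`torch.jit.load` in Python under `torch.no_grad()`. If this matches, the `undefined Tensor` error is raised by the C++ extension rather than by tracing or serialization, which would fit the mismatched header paths noted above.

```python
import os
import tempfile

import torch
import torch.nn as nn

# Hypothetical stand-in model, small enough to trace instantly.
model = nn.Sequential(
    nn.Conv2d(3, 8, 3),   # (1, 3, 4, 4) -> (1, 8, 2, 2)
    nn.ReLU(),
    nn.Flatten(),         # -> (1, 32)
    nn.Linear(32, 4),     # -> (1, 4)
)
model.eval()
tensor = torch.rand(1, 3, 4, 4)

with torch.no_grad():
    # Same steps process_model performs on the Python side.
    traced = torch.jit.trace(model, tensor)
    path = os.path.join(tempfile.mkdtemp(), "model.pt")
    traced.save(path)

    # Reload in Python instead of handing the file to the C++ extension.
    reloaded = torch.jit.load(path)
    py_out = model(tensor)
    ts_out = reloaded(tensor)

# If this holds, the serialized model itself is fine.
assert torch.allclose(py_out, ts_out, atol=1e-6)
```

If the Python round-trip passes while `func("model.pt", tensor)` fails, the C++ test binary is the suspect, e.g. built against different libtorch headers than the installed runtime.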
```
___________________________ Tester.test_cityscapes ____________________________

    def test_cityscapes(self):
        with cityscapes_root() as root:
            ...
>                   self.assertTrue(isinstance(output[1][2], PIL.Image.Image))  # color

test\test_datasets.py:195:
..\conda\envs\py37\lib\contextlib.py:119: in __exit__
    next(self.gen)
test\fakedata_generation.py:243: in cityscapes_root
    yield tmp_dir
test\common_utils.py:16: in get_tmp_dir
    shutil.rmtree(tmp_dir)
..\conda\envs\py37\lib\shutil.py:516: in rmtree
    return _rmtree_unsafe(path, onerror)
..\conda\envs\py37\lib\shutil.py:400: in _rmtree_unsafe
    onerror(os.unlink, fullname, sys.exc_info())

>           os.unlink(fullname)
E           PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmp5etnebcf\\gtFine\\test\\bochum\\bochum_000000_000000_gtFine_color.png'

..\conda\envs\py37\lib\shutil.py:398: PermissionError
```

The teardown fails because a generated fake-data PNG is apparently still held open when `shutil.rmtree` runs; Windows refuses to delete a file that another handle still has open.
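The fix pattern for this kind of teardown failure can be sketched as follows (a hypothetical reproduction with a plain file, not torchvision's actual fakedata code): make sure every handle to files under the temp directory is closed before `shutil.rmtree` runs. On POSIX the delete succeeds even with open handles, which is why the bug only surfaces in the Windows CI run.

```python
import os
import shutil
import tempfile

tmp_dir = tempfile.mkdtemp()
png_path = os.path.join(tmp_dir, "fake_gtFine_color.png")

# The handle is guaranteed closed when the with-block exits.
with open(png_path, "wb") as f:
    f.write(b"\x89PNG\r\n\x1a\n")  # minimal PNG signature as fake data

# With no handle left open, cleanup succeeds on every platform,
# including Windows, where an open handle would raise WinError 32.
shutil.rmtree(tmp_dir)
assert not os.path.exists(tmp_dir)
```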
```
__________________________ Tester.test_extract_gzip ___________________________

    def test_extract_gzip(self):
        with get_tmp_dir() as temp_dir:
            with tempfile.NamedTemporaryFile(suffix='.gz') as f:
>               with gzip.GzipFile(f.name, 'wb') as zf:

test\test_datasets_utils.py:101:
E           PermissionError: [Errno 13] Permission denied: 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmpc1wq6shu.gz'

..\conda\envs\py37\lib\gzip.py:163: PermissionError
___________________________ Tester.test_extract_tar ___________________________

    def test_extract_tar(self):
        for ext, mode in zip(['.tar', '.tar.gz'], ['w', 'w:gz']):
            with get_tmp_dir() as temp_dir:
                with tempfile.NamedTemporaryFile() as bf:
                    bf.write("this is the content".encode())
                    bf.seek(0)
                    with tempfile.NamedTemporaryFile(suffix=ext) as f:
>                       with tarfile.open(f.name, mode=mode) as zf:

test\test_datasets_utils.py:90:
E           PermissionError: [Errno 13] Permission denied: 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmplby3znrd.tar'

..\conda\envs\py37\lib\tarfile.py:1436: PermissionError
___________________________ Tester.test_extract_zip ___________________________

    def test_extract_zip(self):
        with get_tmp_dir() as temp_dir:
            with tempfile.NamedTemporaryFile(suffix='.zip') as f:
                with zipfile.ZipFile(f, 'w') as zf:
                    zf.writestr('file.tst', 'this is the content')
>               utils.extract_archive(f.name, temp_dir)

test\test_datasets_utils.py:77:
..\conda\envs\py37\lib\site-packages\torchvision\datasets\utils.py:231: in extract_archive
    with zipfile.ZipFile(from_path, 'r') as z:
E           PermissionError: [Errno 13] Permission denied: 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmpiwmc4x4z.zip'

..\conda\envs\py37\lib\zipfile.py:1207: PermissionError
```

All three `test_extract_*` failures share a root cause: each test reopens a `tempfile.NamedTemporaryFile` by its name while the original handle is still open, which POSIX permits but Windows rejects with `PermissionError: [Errno 13]`.
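A portable pattern that sidesteps the reopen-by-name restriction is `delete=False`: close the handle first, reopen the path freely, and delete the file manually. A minimal sketch using the same zip round-trip as `test_extract_zip`:

```python
import os
import tempfile
import zipfile

# delete=False means closing the file does NOT remove it, so we can
# close the handle before anything reopens the path by name. This is
# the documented way to reuse a NamedTemporaryFile's name on Windows.
f = tempfile.NamedTemporaryFile(suffix=".zip", delete=False)
f.close()  # release the handle before reopening the path

try:
    with zipfile.ZipFile(f.name, "w") as zf:
        zf.writestr("file.tst", "this is the content")
    with zipfile.ZipFile(f.name, "r") as zf:
        names = zf.namelist()
finally:
    os.unlink(f.name)  # manual cleanup replaces the automatic delete
```

The same pattern would apply to the gzip and tar variants, since all three fail on the second open of a still-open temp file.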
```
___________________________ Tester.test_video_clips ___________________________

    def test_video_clips(self):
        with get_list_of_videos(num_videos=3) as video_list:
>           video_clips = VideoClips(video_list, 5, 5)

test\test_datasets_video_utils.py:62:
..\conda\envs\py37\lib\site-packages\torchvision\datasets\video_utils.py:55: in __init__
    self._compute_frame_pts()
..\conda\envs\py37\lib\site-packages\torchvision\datasets\video_utils.py:84: in _compute_frame_pts
    for batch in dl:
..\conda\envs\py37\lib\site-packages\torch\utils\data\dataloader.py:682: in __init__
    w.start()
..\conda\envs\py37\lib\multiprocessing\popen_spawn_win32.py:89: in __init__
    reduction.dump(process_obj, to_child)

    def dump(obj, file, protocol=None):
        '''Replacement for pickle.dump() using ForkingPickler.'''
>       ForkingPickler(file, protocol).dump(obj)
E       AttributeError: Can't pickle local object 'VideoClips._compute_frame_pts.<locals>.DS'

..\conda\envs\py37\lib\multiprocessing\reduction.py:60: AttributeError
_____________________ Tester.test_video_clips_custom_fps ______________________

    def test_video_clips_custom_fps(self):
        with get_list_of_videos(num_videos=3, sizes=[12, 12, 12], fps=[3, 4, 6]) as video_list:
            num_frames = 4
            for fps in [1, 3, 4, 10]:
>               video_clips = VideoClips(video_list, num_frames, num_frames, fps)

test\test_datasets_video_utils.py:117:
E       AttributeError: Can't pickle local object 'VideoClips._compute_frame_pts.<locals>.DS'

..\conda\envs\py37\lib\multiprocessing\reduction.py:60: AttributeError
---------------------------- Captured stderr call -----------------------------
  File "c:\w\2\s\packaging\windows\conda\envs\py37\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
```
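The `VideoClips` failures come from Windows using the `spawn` start method for multiprocessing: DataLoader worker startup pickles the dataset object, and pickle cannot serialize a class defined inside a function (a "local object"). A minimal sketch demonstrating both the failure mode and the usual fix (hoisting the class to module level; `ModuleLevelDS` and `make_local_ds` are hypothetical names):

```python
import pickle


class ModuleLevelDS:
    """Picklable: importable by its qualified name at module scope."""

    def __init__(self, x):
        self.x = x


def make_local_ds():
    # Mirrors the shape of VideoClips._compute_frame_pts.<locals>.DS:
    # a class defined inside a function body.
    class DS:
        pass

    return DS()


# Module-level class round-trips fine.
blob = pickle.dumps(ModuleLevelDS(1))
assert pickle.loads(blob).x == 1

# The local class cannot be pickled, which is exactly what the spawn
# start method needs to do when it ships the dataset to a worker.
try:
    pickle.dumps(make_local_ds())
    local_pickled = True
except (AttributeError, pickle.PicklingError):
    local_pickled = False
```

On Linux the default `fork` start method inherits the parent's memory and never pickles the dataset, so the bug is invisible there; moving `DS` to module level (or setting `num_workers=0`) would make the tests pass under `spawn` as well.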
\"c:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\multiprocessing\\spawn.py\", line 115, in _main\r\n\r\n self = reduction.pickle.load(from_parent)\r\n\r\nEOFError: Ran out of input\r\n\r\n_______________________ Tester.test_read_partial_video ________________________\r\n\r\nself = \r\n\r\n def test_read_partial_video(self):\r\n> with temp_video(10, 300, 300, 5, lossless=True) as (f_name, data):\r\n\r\ntest\\test_io.py:84: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n..\\conda\\envs\\py37\\lib\\contextlib.py:112: in __enter__\r\n return next(self.gen)\r\ntest\\test_io.py:51: in temp_video\r\n io.write_video(f.name, data, fps=fps, video_codec=video_codec, options=options)\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\io\\video.py:55: in write_video\r\n container.mux(packet)\r\nav/container/output.pyx:198: in av.container.output.OutputContainer.mux\r\n ???\r\nav/container/output.pyx:204: in av.container.output.OutputContainer.mux_one\r\n ???\r\nav/container/output.pyx:166: in av.container.output.OutputContainer.start_encoding\r\n ???\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\n> ???\r\nE av.AVError: [Errno 13] Permission denied\r\n\r\nav/utils.pyx:109: AVError\r\n___________________ Tester.test_read_partial_video_bframes ____________________\r\n\r\nself = \r\n\r\n def test_read_partial_video_bframes(self):\r\n # do not use lossless encoding, to test the presence of B-frames\r\n options = {'bframes': '16', 'keyint': '10', 'min-keyint': '4'}\r\n> with temp_video(100, 300, 300, 5, options=options) as (f_name, data):\r\n\r\ntest\\test_io.py:100: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n..\\conda\\envs\\py37\\lib\\contextlib.py:112: in __enter__\r\n return next(self.gen)\r\ntest\\test_io.py:51: in temp_video\r\n io.write_video(f.name, data, fps=fps, video_codec=video_codec, 
options=options)\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\io\\video.py:55: in write_video\r\n container.mux(packet)\r\nav/container/output.pyx:198: in av.container.output.OutputContainer.mux\r\n ???\r\nav/container/output.pyx:204: in av.container.output.OutputContainer.mux_one\r\n ???\r\nav/container/output.pyx:166: in av.container.output.OutputContainer.start_encoding\r\n ???\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\n> ???\r\nE av.AVError: [Errno 13] Permission denied\r\n\r\nav/utils.pyx:109: AVError\r\n_________________________ Tester.test_read_timestamps _________________________\r\n\r\nself = \r\n\r\n def test_read_timestamps(self):\r\n> with temp_video(10, 300, 300, 5) as (f_name, data):\r\n\r\ntest\\test_io.py:69: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n..\\conda\\envs\\py37\\lib\\contextlib.py:112: in __enter__\r\n return next(self.gen)\r\ntest\\test_io.py:51: in temp_video\r\n io.write_video(f.name, data, fps=fps, video_codec=video_codec, options=options)\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\io\\video.py:59: in write_video\r\n container.mux(packet)\r\nav/container/output.pyx:198: in av.container.output.OutputContainer.mux\r\n ???\r\nav/container/output.pyx:204: in av.container.output.OutputContainer.mux_one\r\n ???\r\nav/container/output.pyx:166: in av.container.output.OutputContainer.start_encoding\r\n ???\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\n> ???\r\nE av.AVError: [Errno 13] Permission denied\r\n\r\nav/utils.pyx:109: AVError\r\n___________________ Tester.test_read_timestamps_from_packet ___________________\r\n\r\nself = \r\n\r\n def test_read_timestamps_from_packet(self):\r\n> with temp_video(10, 300, 300, 5, video_codec='mpeg4') as (f_name, data):\r\n\r\ntest\\test_io.py:129: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_\r\n..\\conda\\envs\\py37\\lib\\contextlib.py:112: in __enter__\r\n return next(self.gen)\r\ntest\\test_io.py:51: in temp_video\r\n io.write_video(f.name, data, fps=fps, video_codec=video_codec, options=options)\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\io\\video.py:55: in write_video\r\n container.mux(packet)\r\nav/container/output.pyx:198: in av.container.output.OutputContainer.mux\r\n ???\r\nav/container/output.pyx:204: in av.container.output.OutputContainer.mux_one\r\n ???\r\nav/container/output.pyx:166: in av.container.output.OutputContainer.start_encoding\r\n ???\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\n> ???\r\nE av.AVError: [Errno 13] Permission denied\r\n\r\nav/utils.pyx:109: AVError\r\n________________________ Tester.test_write_read_video _________________________\r\n\r\nself = \r\n\r\n def test_write_read_video(self):\r\n> with temp_video(10, 300, 300, 5, lossless=True) as (f_name, data):\r\n\r\ntest\\test_io.py:62: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n..\\conda\\envs\\py37\\lib\\contextlib.py:112: in __enter__\r\n return next(self.gen)\r\ntest\\test_io.py:51: in temp_video\r\n io.write_video(f.name, data, fps=fps, video_codec=video_codec, options=options)\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\io\\video.py:55: in write_video\r\n container.mux(packet)\r\nav/container/output.pyx:198: in av.container.output.OutputContainer.mux\r\n ???\r\nav/container/output.pyx:204: in av.container.output.OutputContainer.mux_one\r\n ???\r\nav/container/output.pyx:166: in av.container.output.OutputContainer.start_encoding\r\n ???\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\n> ???\r\nE av.AVError: [Errno 13] Permission denied\r\n\r\nav/utils.pyx:109: AVError\r\n___________________________ Tester.test_save_image ____________________________\r\n\r\nself = \r\n\r\n def test_save_image(self):\r\n 
with tempfile.NamedTemporaryFile(suffix='.png') as f:\r\n t = torch.rand(2, 3, 64, 64)\r\n> utils.save_image(t, f.name)\r\n\r\ntest\\test_utils.py:43: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\utils.py:105: in save_image\r\n im.save(filename)\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nself = \r\nfp = 'C:\\\\Users\\\\ADMINI~1\\\\AppData\\\\Local\\\\Temp\\\\tmpm0s9rq8o.png'\r\nformat = 'PNG', params = {}\r\nfilename = 'C:\\\\Users\\\\ADMINI~1\\\\AppData\\\\Local\\\\Temp\\\\tmpm0s9rq8o.png'\r\nopen_fp = True, save_all = False, ext = '.png'\r\nsave_handler = \r\n\r\n def save(self, fp, format=None, **params):\r\n \"\"\"\r\n Saves this image under the given filename. If no format is\r\n specified, the format to use is determined from the filename\r\n extension, if possible.\r\n \r\n Keyword options can be used to provide additional instructions\r\n to the writer. If a writer doesn't recognise an option, it is\r\n silently ignored. The available options are described in the\r\n :doc:`image format documentation\r\n <../handbook/image-file-formats>` for each writer.\r\n \r\n You can use a file object instead of a filename. In this case,\r\n you must always specify the format. The file object must\r\n implement the ``seek``, ``tell``, and ``write``\r\n methods, and be opened in binary mode.\r\n \r\n :param fp: A filename (string), pathlib.Path object or file object.\r\n :param format: Optional format override. If omitted, the\r\n format to use is determined from the filename extension.\r\n If a file object was used instead of a filename, this\r\n parameter should always be used.\r\n :param params: Extra parameters to the image writer.\r\n :returns: None\r\n :exception ValueError: If the output format could not be determined\r\n from the file name. 
Use the format option to solve this.\r\n :exception IOError: If the file could not be written. The file\r\n may have been created, and may contain partial data.\r\n \"\"\"\r\n \r\n filename = \"\"\r\n open_fp = False\r\n if isPath(fp):\r\n filename = fp\r\n open_fp = True\r\n elif HAS_PATHLIB and isinstance(fp, Path):\r\n filename = str(fp)\r\n open_fp = True\r\n if not filename and hasattr(fp, \"name\") and isPath(fp.name):\r\n # only set the name for metadata purposes\r\n filename = fp.name\r\n \r\n # may mutate self!\r\n self._ensure_mutable()\r\n \r\n save_all = params.pop(\"save_all\", False)\r\n self.encoderinfo = params\r\n self.encoderconfig = ()\r\n \r\n preinit()\r\n \r\n ext = os.path.splitext(filename)[1].lower()\r\n \r\n if not format:\r\n if ext not in EXTENSION:\r\n init()\r\n try:\r\n format = EXTENSION[ext]\r\n except KeyError:\r\n raise ValueError(\"unknown file extension: {}\".format(ext))\r\n \r\n if format.upper() not in SAVE:\r\n init()\r\n if save_all:\r\n save_handler = SAVE_ALL[format.upper()]\r\n else:\r\n save_handler = SAVE[format.upper()]\r\n \r\n if open_fp:\r\n if params.get(\"append\", False):\r\n fp = builtins.open(filename, \"r+b\")\r\n else:\r\n # Open also for reading (\"+\"), because TIFF save_all\r\n # writer needs to go back and edit the written data.\r\n> fp = builtins.open(filename, \"w+b\")\r\nE PermissionError: [Errno 13] Permission denied: 'C:\\\\Users\\\\ADMINI~1\\\\AppData\\\\Local\\\\Temp\\\\tmpm0s9rq8o.png'\r\n\r\n..\\conda\\envs\\py37\\lib\\site-packages\\PIL\\Image.py:2085: PermissionError\r\n_____________________ Tester.test_save_image_single_pixel _____________________\r\n\r\nself = \r\n\r\n def test_save_image_single_pixel(self):\r\n with tempfile.NamedTemporaryFile(suffix='.png') as f:\r\n t = torch.rand(1, 3, 1, 1)\r\n> utils.save_image(t, f.name)\r\n\r\ntest\\test_utils.py:49: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\utils.py:105: in save_image\r\n im.save(filename)\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nself = \r\nfp = 'C:\\\\Users\\\\ADMINI~1\\\\AppData\\\\Local\\\\Temp\\\\tmpn5hd8b_0.png'\r\nformat = 'PNG', params = {}\r\nfilename = 'C:\\\\Users\\\\ADMINI~1\\\\AppData\\\\Local\\\\Temp\\\\tmpn5hd8b_0.png'\r\nopen_fp = True, save_all = False, ext = '.png'\r\nsave_handler = \r\n\r\n def save(self, fp, format=None, **params):\r\n \"\"\"\r\n Saves this image under the given filename. If no format is\r\n specified, the format to use is determined from the filename\r\n extension, if possible.\r\n \r\n Keyword options can be used to provide additional instructions\r\n to the writer. If a writer doesn't recognise an option, it is\r\n silently ignored. The available options are described in the\r\n :doc:`image format documentation\r\n <../handbook/image-file-formats>` for each writer.\r\n \r\n You can use a file object instead of a filename. In this case,\r\n you must always specify the format. The file object must\r\n implement the ``seek``, ``tell``, and ``write``\r\n methods, and be opened in binary mode.\r\n \r\n :param fp: A filename (string), pathlib.Path object or file object.\r\n :param format: Optional format override. If omitted, the\r\n format to use is determined from the filename extension.\r\n If a file object was used instead of a filename, this\r\n parameter should always be used.\r\n :param params: Extra parameters to the image writer.\r\n :returns: None\r\n :exception ValueError: If the output format could not be determined\r\n from the file name. Use the format option to solve this.\r\n :exception IOError: If the file could not be written. 
The file\r\n may have been created, and may contain partial data.\r\n \"\"\"\r\n \r\n filename = \"\"\r\n open_fp = False\r\n if isPath(fp):\r\n filename = fp\r\n open_fp = True\r\n elif HAS_PATHLIB and isinstance(fp, Path):\r\n filename = str(fp)\r\n open_fp = True\r\n if not filename and hasattr(fp, \"name\") and isPath(fp.name):\r\n # only set the name for metadata purposes\r\n filename = fp.name\r\n \r\n # may mutate self!\r\n self._ensure_mutable()\r\n \r\n save_all = params.pop(\"save_all\", False)\r\n self.encoderinfo = params\r\n self.encoderconfig = ()\r\n \r\n preinit()\r\n \r\n ext = os.path.splitext(filename)[1].lower()\r\n \r\n if not format:\r\n if ext not in EXTENSION:\r\n init()\r\n try:\r\n format = EXTENSION[ext]\r\n except KeyError:\r\n raise ValueError(\"unknown file extension: {}\".format(ext))\r\n \r\n if format.upper() not in SAVE:\r\n init()\r\n if save_all:\r\n save_handler = SAVE_ALL[format.upper()]\r\n else:\r\n save_handler = SAVE[format.upper()]\r\n \r\n if open_fp:\r\n if params.get(\"append\", False):\r\n fp = builtins.open(filename, \"r+b\")\r\n else:\r\n # Open also for reading (\"+\"), because TIFF save_all\r\n # writer needs to go back and edit the written data.\r\n> fp = builtins.open(filename, \"w+b\")\r\nE PermissionError: [Errno 13] Permission denied: 'C:\\\\Users\\\\ADMINI~1\\\\AppData\\\\Local\\\\Temp\\\\tmpn5hd8b_0.png'\r\n\r\n..\\conda\\envs\\py37\\lib\\site-packages\\PIL\\Image.py:2085: PermissionError\r\n============================== warnings summary ===============================\r\nc:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\datasets\\lsun.py:8\r\n c:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\datasets\\lsun.py:8: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working\r\n from collections import 
Iterable\r\n\r\nc:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\av\\container\\__init__.py:1\r\n c:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\av\\container\\__init__.py:1: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working\r\n from .core import Container, open\r\n\r\ntest/test_datasets.py::Tester::test_imagenet\r\n c:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\importlib\\_bootstrap.py:219: RuntimeWarning: numpy.ufunc size changed, may indicate binary incompatibility. Expected 192 from C header, got 216 from PyObject\r\n return f(*args, **kwds)\r\n\r\ntest/test_transforms.py::Tester::test_randomperspective\r\n c:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\transforms\\functional.py:440: UserWarning: torch.gels is deprecated in favour of torch.lstsq and will be removed in the next release. Please use torch.lstsq instead.\r\n res = torch.gels(B, A)[0]\r\n\r\n-- Docs: https://docs.pytest.org/en/latest/warnings.html\r\n======= 32 failed, 141 passed, 14 skipped, 4 warnings in 407.68 seconds =======\r\n```","title":"Several tests fail on Windows with 0.4.0","body":"Test log:\r\n```\r\n============================= test session starts =============================\r\nplatform win32 -- Python 3.7.4, pytest-5.0.1, py-1.8.0, pluggy-0.12.0\r\nrootdir: C:\\w\\2\\s\\packaging\\windows\\vision\r\ncollected 187 items\r\n\r\ntest\\test_backbone_utils.py .. [ 1%]\r\ntest\\test_cpp_models.py FFFFF..FFFF........FFFFFFFFFF.. [ 17%]\r\ntest\\test_datasets.py ..F...... [ 22%]\r\ntest\\test_datasets_transforms.py .. [ 23%]\r\ntest\\test_datasets_utils.py .....FFF. [ 28%]\r\ntest\\test_datasets_video_utils.py ..FFss [ 31%]\r\ntest\\test_io.py .FFFFF [ 34%]\r\ntest\\test_models.py ................................................ 
[ 60%]\r\ntest\\test_ops.py ..s..s.s.s.s.s.s.s.s [ 71%]\r\ntest\\test_transforms.py ..........sss................................... [ 96%]\r\n.. [ 97%]\r\ntest\\test_utils.py ..FF [100%]\r\n\r\n================================== FAILURES ===================================\r\n_____________________________ Tester.test_alexnet _____________________________\r\n\r\nself = \r\n\r\n def test_alexnet(self):\r\n> process_model(models.alexnet(self.pretrained), self.image, _C_tests.forward_alexnet, 'Alexnet')\r\n\r\ntest\\test_cpp_models.py:43: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = AlexNet(\r\n (features): Sequential(\r\n (0): Conv2d(3, 64, kernel_size=(11, 11), stride=(4, 4), padding=(2, 2))\r\n (1)...ures=4096, bias=True)\r\n (5): ReLU(inplace=True)\r\n (6): Linear(in_features=4096, out_features=1000, bias=True)\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'Alexnet'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n___________________________ Tester.test_densenet121 ___________________________\r\n\r\nself = \r\n\r\n def test_densenet121(self):\r\n> process_model(models.densenet121(self.pretrained), self.image, _C_tests.forward_densenet121, 'Densenet121')\r\n\r\ntest\\test_cpp_models.py:105: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = DenseNet(\r\n (features): Sequential(\r\n (conv0): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias....1, affine=True, track_running_stats=True)\r\n )\r\n (classifier): Linear(in_features=1024, out_features=1000, bias=True)\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'Densenet121'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n___________________________ Tester.test_densenet161 ___________________________\r\n\r\nself = \r\n\r\n def test_densenet161(self):\r\n> process_model(models.densenet161(self.pretrained), self.image, _C_tests.forward_densenet161, 'Densenet161')\r\n\r\ntest\\test_cpp_models.py:114: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = DenseNet(\r\n (features): Sequential(\r\n (conv0): Conv2d(3, 96, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias....1, affine=True, track_running_stats=True)\r\n )\r\n (classifier): Linear(in_features=2208, out_features=1000, bias=True)\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'Densenet161'\r\n\r\n def 
process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n___________________________ Tester.test_densenet169 ___________________________\r\n\r\nself = \r\n\r\n def test_densenet169(self):\r\n> process_model(models.densenet169(self.pretrained), self.image, _C_tests.forward_densenet169, 'Densenet169')\r\n\r\ntest\\test_cpp_models.py:108: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = DenseNet(\r\n (features): Sequential(\r\n (conv0): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias....1, affine=True, track_running_stats=True)\r\n )\r\n (classifier): Linear(in_features=1664, out_features=1000, bias=True)\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'Densenet169'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n___________________________ Tester.test_densenet201 
___________________________\r\n\r\nself = \r\n\r\n def test_densenet201(self):\r\n> process_model(models.densenet201(self.pretrained), self.image, _C_tests.forward_densenet201, 'Densenet201')\r\n\r\ntest\\test_cpp_models.py:111: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = DenseNet(\r\n (features): Sequential(\r\n (conv0): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias....1, affine=True, track_running_stats=True)\r\n )\r\n (classifier): Linear(in_features=1920, out_features=1000, bias=True)\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'Densenet201'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n___________________________ Tester.test_mnasnet0_5 ____________________________\r\n\r\nself = \r\n\r\n def test_mnasnet0_5(self):\r\n> process_model(models.mnasnet0_5(self.pretrained), self.image, _C_tests.forward_mnasnet0_5, 'MNASNet0_5')\r\n\r\ntest\\test_cpp_models.py:123: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = MNASNet(\r\n (layers): Sequential(\r\n (0): Conv2d(3, 32, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)...Sequential(\r\n (0): Dropout(p=0.2, inplace=True)\r\n (1): Linear(in_features=1280, out_features=1000, bias=True)\r\n )\r\n)\r\ntensor = 
tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'MNASNet0_5'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n___________________________ Tester.test_mnasnet0_75 ___________________________\r\n\r\nself = \r\n\r\n def test_mnasnet0_75(self):\r\n> process_model(models.mnasnet0_75(self.pretrained), self.image, _C_tests.forward_mnasnet0_75, 'MNASNet0_75')\r\n\r\ntest\\test_cpp_models.py:126: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = MNASNet(\r\n (layers): Sequential(\r\n (0): Conv2d(3, 32, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)...Sequential(\r\n (0): Dropout(p=0.2, inplace=True)\r\n (1): Linear(in_features=1280, out_features=1000, bias=True)\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'MNASNet0_75'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at 
C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n___________________________ Tester.test_mnasnet1_0 ____________________________\r\n\r\nself = \r\n\r\n def test_mnasnet1_0(self):\r\n> process_model(models.mnasnet1_0(self.pretrained), self.image, _C_tests.forward_mnasnet1_0, 'MNASNet1_0')\r\n\r\ntest\\test_cpp_models.py:129: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = MNASNet(\r\n (layers): Sequential(\r\n (0): Conv2d(3, 32, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)...Sequential(\r\n (0): Dropout(p=0.2, inplace=True)\r\n (1): Linear(in_features=1280, out_features=1000, bias=True)\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'MNASNet1_0'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n___________________________ Tester.test_mnasnet1_3 ____________________________\r\n\r\nself = \r\n\r\n def test_mnasnet1_3(self):\r\n> process_model(models.mnasnet1_3(self.pretrained), self.image, _C_tests.forward_mnasnet1_3, 'MNASNet1_3')\r\n\r\ntest\\test_cpp_models.py:132: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = MNASNet(\r\n (layers): Sequential(\r\n (0): Conv2d(3, 32, kernel_size=(3, 3), stride=(2, 2), 
padding=(1, 1), bias=False)...Sequential(\r\n (0): Dropout(p=0.2, inplace=True)\r\n (1): Linear(in_features=1280, out_features=1000, bias=True)\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'MNASNet1_3'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n__________________________ Tester.test_squeezenet1_0 __________________________\r\n\r\nself = \r\n\r\n def test_squeezenet1_0(self):\r\n process_model(models.squeezenet1_0(self.pretrained), self.image,\r\n> _C_tests.forward_squeezenet1_0, 'Squeezenet1.0')\r\n\r\ntest\\test_cpp_models.py:98: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = SqueezeNet(\r\n (features): Sequential(\r\n (0): Conv2d(3, 96, kernel_size=(7, 7), stride=(2, 2))\r\n (1): ReLU(inplace=...00, kernel_size=(1, 1), stride=(1, 1))\r\n (2): ReLU(inplace=True)\r\n (3): AdaptiveAvgPool2d(output_size=(1, 1))\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'Squeezenet1.0'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n 
py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n__________________________ Tester.test_squeezenet1_1 __________________________\r\n\r\nself = \r\n\r\n def test_squeezenet1_1(self):\r\n process_model(models.squeezenet1_1(self.pretrained), self.image,\r\n> _C_tests.forward_squeezenet1_1, 'Squeezenet1.1')\r\n\r\ntest\\test_cpp_models.py:102: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = SqueezeNet(\r\n (features): Sequential(\r\n (0): Conv2d(3, 64, kernel_size=(3, 3), stride=(2, 2))\r\n (1): ReLU(inplace=...00, kernel_size=(1, 1), stride=(1, 1))\r\n (2): ReLU(inplace=True)\r\n (3): AdaptiveAvgPool2d(output_size=(1, 1))\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'Squeezenet1.1'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n______________________________ Tester.test_vgg11 ______________________________\r\n\r\nself = \r\n\r\n def test_vgg11(self):\r\n> process_model(models.vgg11(self.pretrained), self.image, _C_tests.forward_vgg11, 
'VGG11')\r\n\r\ntest\\test_cpp_models.py:46: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = VGG(\r\n (features): Sequential(\r\n (0): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\r\n (1): ReLU...lace=True)\r\n (5): Dropout(p=0.5, inplace=False)\r\n (6): Linear(in_features=4096, out_features=1000, bias=True)\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'VGG11'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n____________________________ Tester.test_vgg11_bn _____________________________\r\n\r\nself = \r\n\r\n def test_vgg11_bn(self):\r\n> process_model(models.vgg11_bn(self.pretrained), self.image, _C_tests.forward_vgg11bn, 'VGG11BN')\r\n\r\ntest\\test_cpp_models.py:58: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = VGG(\r\n (features): Sequential(\r\n (0): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\r\n (1): Batc...lace=True)\r\n (5): Dropout(p=0.5, inplace=False)\r\n (6): Linear(in_features=4096, out_features=1000, bias=True)\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 
0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'VGG11BN'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n______________________________ Tester.test_vgg13 ______________________________\r\n\r\nself = \r\n\r\n def test_vgg13(self):\r\n> process_model(models.vgg13(self.pretrained), self.image, _C_tests.forward_vgg13, 'VGG13')\r\n\r\ntest\\test_cpp_models.py:49: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = VGG(\r\n (features): Sequential(\r\n (0): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\r\n (1): ReLU...lace=True)\r\n (5): Dropout(p=0.5, inplace=False)\r\n (6): Linear(in_features=4096, out_features=1000, bias=True)\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'VGG13'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n____________________________ Tester.test_vgg13_bn _____________________________\r\n\r\nself = 
\r\n\r\n def test_vgg13_bn(self):\r\n> process_model(models.vgg13_bn(self.pretrained), self.image, _C_tests.forward_vgg13bn, 'VGG13BN')\r\n\r\ntest\\test_cpp_models.py:61: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = VGG(\r\n (features): Sequential(\r\n (0): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\r\n (1): Batc...lace=True)\r\n (5): Dropout(p=0.5, inplace=False)\r\n (6): Linear(in_features=4096, out_features=1000, bias=True)\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'VGG13BN'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n______________________________ Tester.test_vgg16 ______________________________\r\n\r\nself = \r\n\r\n def test_vgg16(self):\r\n> process_model(models.vgg16(self.pretrained), self.image, _C_tests.forward_vgg16, 'VGG16')\r\n\r\ntest\\test_cpp_models.py:52: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = VGG(\r\n (features): Sequential(\r\n (0): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\r\n (1): ReLU...lace=True)\r\n (5): Dropout(p=0.5, inplace=False)\r\n (6): Linear(in_features=4096, out_features=1000, bias=True)\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 
0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'VGG16'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n____________________________ Tester.test_vgg16_bn _____________________________\r\n\r\nself = \r\n\r\n def test_vgg16_bn(self):\r\n> process_model(models.vgg16_bn(self.pretrained), self.image, _C_tests.forward_vgg16bn, 'VGG16BN')\r\n\r\ntest\\test_cpp_models.py:64: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = VGG(\r\n (features): Sequential(\r\n (0): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\r\n (1): Batc...lace=True)\r\n (5): Dropout(p=0.5, inplace=False)\r\n (6): Linear(in_features=4096, out_features=1000, bias=True)\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'VGG16BN'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n______________________________ 
Tester.test_vgg19 ______________________________\r\n\r\nself = \r\n\r\n def test_vgg19(self):\r\n> process_model(models.vgg19(self.pretrained), self.image, _C_tests.forward_vgg19, 'VGG19')\r\n\r\ntest\\test_cpp_models.py:55: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = VGG(\r\n (features): Sequential(\r\n (0): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\r\n (1): ReLU...lace=True)\r\n (5): Dropout(p=0.5, inplace=False)\r\n (6): Linear(in_features=4096, out_features=1000, bias=True)\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'VGG19'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n____________________________ Tester.test_vgg19_bn _____________________________\r\n\r\nself = \r\n\r\n def test_vgg19_bn(self):\r\n> process_model(models.vgg19_bn(self.pretrained), self.image, _C_tests.forward_vgg19bn, 'VGG19BN')\r\n\r\ntest\\test_cpp_models.py:67: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nmodel = VGG(\r\n (features): Sequential(\r\n (0): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\r\n (1): Batc...lace=True)\r\n (5): Dropout(p=0.5, inplace=False)\r\n (6): Linear(in_features=4096, out_features=1000, bias=True)\r\n )\r\n)\r\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 
0.2392],\r\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\r\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\r\nfunc = \r\nname = 'VGG19BN'\r\n\r\n def process_model(model, tensor, func, name):\r\n model.eval()\r\n traced_script_module = torch.jit.trace(model, tensor)\r\n traced_script_module.save(\"model.pt\")\r\n \r\n py_output = model.forward(tensor)\r\n> cpp_output = func(\"model.pt\", tensor)\r\nE RuntimeError: undefined Tensor (infer_is_variable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\r\nE (no backtrace available)\r\n\r\ntest\\test_cpp_models.py:16: RuntimeError\r\n___________________________ Tester.test_cityscapes ____________________________\r\n\r\nself = \r\n\r\n def test_cityscapes(self):\r\n with cityscapes_root() as root:\r\n \r\n for mode in ['coarse', 'fine']:\r\n \r\n if mode == 'coarse':\r\n splits = ['train', 'train_extra', 'val']\r\n else:\r\n splits = ['train', 'val', 'test']\r\n \r\n for split in splits:\r\n for target_type in ['semantic', 'instance']:\r\n dataset = torchvision.datasets.Cityscapes(root, split=split,\r\n target_type=target_type, mode=mode)\r\n self.generic_segmentation_dataset_test(dataset, num_images=2)\r\n \r\n color_dataset = torchvision.datasets.Cityscapes(root, split=split,\r\n target_type='color', mode=mode)\r\n color_img, color_target = color_dataset[0]\r\n self.assertTrue(isinstance(color_img, PIL.Image.Image))\r\n self.assertTrue(np.array(color_target).shape[2] == 4)\r\n \r\n polygon_dataset = torchvision.datasets.Cityscapes(root, split=split,\r\n target_type='polygon', mode=mode)\r\n polygon_img, polygon_target = polygon_dataset[0]\r\n self.assertTrue(isinstance(polygon_img, PIL.Image.Image))\r\n self.assertTrue(isinstance(polygon_target, dict))\r\n self.assertTrue(isinstance(polygon_target['imgHeight'], int))\r\n self.assertTrue(isinstance(polygon_target['objects'], list))\r\n \r\n # Test multiple target types\r\n 
targets_combo = ['semantic', 'polygon', 'color']\r\n multiple_types_dataset = torchvision.datasets.Cityscapes(root, split=split,\r\n target_type=targets_combo,\r\n mode=mode)\r\n output = multiple_types_dataset[0]\r\n self.assertTrue(isinstance(output, tuple))\r\n self.assertTrue(len(output) == 2)\r\n self.assertTrue(isinstance(output[0], PIL.Image.Image))\r\n self.assertTrue(isinstance(output[1], tuple))\r\n self.assertTrue(len(output[1]) == 3)\r\n self.assertTrue(isinstance(output[1][0], PIL.Image.Image)) # semantic\r\n self.assertTrue(isinstance(output[1][1], dict)) # polygon\r\n> self.assertTrue(isinstance(output[1][2], PIL.Image.Image)) # color\r\n\r\ntest\\test_datasets.py:195: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n..\\conda\\envs\\py37\\lib\\contextlib.py:119: in __exit__\r\n next(self.gen)\r\ntest\\fakedata_generation.py:243: in cityscapes_root\r\n yield tmp_dir\r\n..\\conda\\envs\\py37\\lib\\contextlib.py:119: in __exit__\r\n next(self.gen)\r\ntest\\common_utils.py:16: in get_tmp_dir\r\n shutil.rmtree(tmp_dir)\r\n..\\conda\\envs\\py37\\lib\\shutil.py:516: in rmtree\r\n return _rmtree_unsafe(path, onerror)\r\n..\\conda\\envs\\py37\\lib\\shutil.py:395: in _rmtree_unsafe\r\n _rmtree_unsafe(fullname, onerror)\r\n..\\conda\\envs\\py37\\lib\\shutil.py:395: in _rmtree_unsafe\r\n _rmtree_unsafe(fullname, onerror)\r\n..\\conda\\envs\\py37\\lib\\shutil.py:395: in _rmtree_unsafe\r\n _rmtree_unsafe(fullname, onerror)\r\n..\\conda\\envs\\py37\\lib\\shutil.py:400: in _rmtree_unsafe\r\n onerror(os.unlink, fullname, sys.exc_info())\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\npath = 'C:\\\\Users\\\\ADMINI~1\\\\AppData\\\\Local\\\\Temp\\\\tmp5etnebcf\\\\gtFine\\\\test\\\\bochum'\r\nonerror = .onerror at 0x000000323F3BFDC8>\r\n\r\n def _rmtree_unsafe(path, onerror):\r\n try:\r\n with os.scandir(path) as scandir_it:\r\n entries = list(scandir_it)\r\n except OSError:\r\n 
onerror(os.scandir, path, sys.exc_info())\r\n entries = []\r\n for entry in entries:\r\n fullname = entry.path\r\n try:\r\n is_dir = entry.is_dir(follow_symlinks=False)\r\n except OSError:\r\n is_dir = False\r\n if is_dir:\r\n try:\r\n if entry.is_symlink():\r\n # This can only happen if someone replaces\r\n # a directory with a symlink after the call to\r\n # os.scandir or entry.is_dir above.\r\n raise OSError(\"Cannot call rmtree on a symbolic link\")\r\n except OSError:\r\n onerror(os.path.islink, fullname, sys.exc_info())\r\n continue\r\n _rmtree_unsafe(fullname, onerror)\r\n else:\r\n try:\r\n> os.unlink(fullname)\r\nE PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\\\Users\\\\ADMINI~1\\\\AppData\\\\Local\\\\Temp\\\\tmp5etnebcf\\\\gtFine\\\\test\\\\bochum\\\\bochum_000000_000000_gtFine_color.png'\r\n\r\n..\\conda\\envs\\py37\\lib\\shutil.py:398: PermissionError\r\n__________________________ Tester.test_extract_gzip ___________________________\r\n\r\nself = \r\n\r\n def test_extract_gzip(self):\r\n with get_tmp_dir() as temp_dir:\r\n with tempfile.NamedTemporaryFile(suffix='.gz') as f:\r\n> with gzip.GzipFile(f.name, 'wb') as zf:\r\n\r\ntest\\test_datasets_utils.py:101: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nself = <[AttributeError(\"'GzipFile' object has no attribute 'fileobj'\") raised in repr()] GzipFile object at 0x32007d28c8>\r\nfilename = 'C:\\\\Users\\\\ADMINI~1\\\\AppData\\\\Local\\\\Temp\\\\tmpc1wq6shu.gz'\r\nmode = 'wb', compresslevel = 9, fileobj = None, mtime = None\r\n\r\n def __init__(self, filename=None, mode=None,\r\n compresslevel=9, fileobj=None, mtime=None):\r\n \"\"\"Constructor for the GzipFile class.\r\n \r\n At least one of fileobj and filename must be given a\r\n non-trivial value.\r\n \r\n The new class instance is based on fileobj, which can be a regular\r\n file, an io.BytesIO object, or any other object which 
simulates a file.\r\n It defaults to None, in which case filename is opened to provide\r\n a file object.\r\n \r\n When fileobj is not None, the filename argument is only used to be\r\n included in the gzip file header, which may include the original\r\n filename of the uncompressed file. It defaults to the filename of\r\n fileobj, if discernible; otherwise, it defaults to the empty string,\r\n and in this case the original filename is not included in the header.\r\n \r\n The mode argument can be any of 'r', 'rb', 'a', 'ab', 'w', 'wb', 'x', or\r\n 'xb' depending on whether the file will be read or written. The default\r\n is the mode of fileobj if discernible; otherwise, the default is 'rb'.\r\n A mode of 'r' is equivalent to one of 'rb', and similarly for 'w' and\r\n 'wb', 'a' and 'ab', and 'x' and 'xb'.\r\n \r\n The compresslevel argument is an integer from 0 to 9 controlling the\r\n level of compression; 1 is fastest and produces the least compression,\r\n and 9 is slowest and produces the most compression. 0 is no compression\r\n at all. 
The default is 9.\r\n \r\n The mtime argument is an optional numeric timestamp to be written\r\n to the last modification time field in the stream when compressing.\r\n If omitted or None, the current time is used.\r\n \r\n \"\"\"\r\n \r\n if mode and ('t' in mode or 'U' in mode):\r\n raise ValueError(\"Invalid mode: {!r}\".format(mode))\r\n if mode and 'b' not in mode:\r\n mode += 'b'\r\n if fileobj is None:\r\n> fileobj = self.myfileobj = builtins.open(filename, mode or 'rb')\r\nE PermissionError: [Errno 13] Permission denied: 'C:\\\\Users\\\\ADMINI~1\\\\AppData\\\\Local\\\\Temp\\\\tmpc1wq6shu.gz'\r\n\r\n..\\conda\\envs\\py37\\lib\\gzip.py:163: PermissionError\r\n___________________________ Tester.test_extract_tar ___________________________\r\n\r\nself = \r\n\r\n def test_extract_tar(self):\r\n for ext, mode in zip(['.tar', '.tar.gz'], ['w', 'w:gz']):\r\n with get_tmp_dir() as temp_dir:\r\n with tempfile.NamedTemporaryFile() as bf:\r\n bf.write(\"this is the content\".encode())\r\n bf.seek(0)\r\n with tempfile.NamedTemporaryFile(suffix=ext) as f:\r\n> with tarfile.open(f.name, mode=mode) as zf:\r\n\r\ntest\\test_datasets_utils.py:90: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n..\\conda\\envs\\py37\\lib\\tarfile.py:1611: in open\r\n return cls.taropen(name, mode, fileobj, **kwargs)\r\n..\\conda\\envs\\py37\\lib\\tarfile.py:1621: in taropen\r\n return cls(name, mode, fileobj, **kwargs)\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nself = \r\nname = 'C:\\\\Users\\\\ADMINI~1\\\\AppData\\\\Local\\\\Temp\\\\tmplby3znrd.tar', mode = 'w'\r\nfileobj = None, format = None, tarinfo = None, dereference = None\r\nignore_zeros = None, encoding = None, errors = 'surrogateescape'\r\npax_headers = None, debug = None, errorlevel = None, copybufsize = None\r\n\r\n def __init__(self, name=None, mode=\"r\", fileobj=None, format=None,\r\n tarinfo=None, dereference=None, ignore_zeros=None, 
encoding=None,\r\n errors=\"surrogateescape\", pax_headers=None, debug=None,\r\n errorlevel=None, copybufsize=None):\r\n \"\"\"Open an (uncompressed) tar archive `name'. `mode' is either 'r' to\r\n read from an existing archive, 'a' to append data to an existing\r\n file or 'w' to create a new file overwriting an existing one. `mode'\r\n defaults to 'r'.\r\n If `fileobj' is given, it is used for reading or writing data. If it\r\n can be determined, `mode' is overridden by `fileobj's mode.\r\n `fileobj' is not closed, when TarFile is closed.\r\n \"\"\"\r\n modes = {\"r\": \"rb\", \"a\": \"r+b\", \"w\": \"wb\", \"x\": \"xb\"}\r\n if mode not in modes:\r\n raise ValueError(\"mode must be 'r', 'a', 'w' or 'x'\")\r\n self.mode = mode\r\n self._mode = modes[mode]\r\n \r\n if not fileobj:\r\n if self.mode == \"a\" and not os.path.exists(name):\r\n # Create nonexistent files in append mode.\r\n self.mode = \"w\"\r\n self._mode = \"wb\"\r\n> fileobj = bltn_open(name, self._mode)\r\nE PermissionError: [Errno 13] Permission denied: 'C:\\\\Users\\\\ADMINI~1\\\\AppData\\\\Local\\\\Temp\\\\tmplby3znrd.tar'\r\n\r\n..\\conda\\envs\\py37\\lib\\tarfile.py:1436: PermissionError\r\n___________________________ Tester.test_extract_zip ___________________________\r\n\r\nself = \r\n\r\n def test_extract_zip(self):\r\n with get_tmp_dir() as temp_dir:\r\n with tempfile.NamedTemporaryFile(suffix='.zip') as f:\r\n with zipfile.ZipFile(f, 'w') as zf:\r\n zf.writestr('file.tst', 'this is the content')\r\n> utils.extract_archive(f.name, temp_dir)\r\n\r\ntest\\test_datasets_utils.py:77: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\datasets\\utils.py:231: in extract_archive\r\n with zipfile.ZipFile(from_path, 'r') as z:\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nself = \r\nfile = 'C:\\\\Users\\\\ADMINI~1\\\\AppData\\\\Local\\\\Temp\\\\tmpiwmc4x4z.zip', 
mode = 'r'\r\ncompression = 0, allowZip64 = True, compresslevel = None\r\n\r\n def __init__(self, file, mode=\"r\", compression=ZIP_STORED, allowZip64=True,\r\n compresslevel=None):\r\n \"\"\"Open the ZIP file with mode read 'r', write 'w', exclusive create 'x',\r\n or append 'a'.\"\"\"\r\n if mode not in ('r', 'w', 'x', 'a'):\r\n raise ValueError(\"ZipFile requires mode 'r', 'w', 'x', or 'a'\")\r\n \r\n _check_compression(compression)\r\n \r\n self._allowZip64 = allowZip64\r\n self._didModify = False\r\n self.debug = 0 # Level of printing: 0 through 3\r\n self.NameToInfo = {} # Find file info given name\r\n self.filelist = [] # List of ZipInfo instances for archive\r\n self.compression = compression # Method of compression\r\n self.compresslevel = compresslevel\r\n self.mode = mode\r\n self.pwd = None\r\n self._comment = b''\r\n \r\n # Check if we were passed a file-like object\r\n if isinstance(file, os.PathLike):\r\n file = os.fspath(file)\r\n if isinstance(file, str):\r\n # No, it's a filename\r\n self._filePassed = 0\r\n self.filename = file\r\n modeDict = {'r' : 'rb', 'w': 'w+b', 'x': 'x+b', 'a' : 'r+b',\r\n 'r+b': 'w+b', 'w+b': 'wb', 'x+b': 'xb'}\r\n filemode = modeDict[mode]\r\n while True:\r\n try:\r\n> self.fp = io.open(file, filemode)\r\nE PermissionError: [Errno 13] Permission denied: 'C:\\\\Users\\\\ADMINI~1\\\\AppData\\\\Local\\\\Temp\\\\tmpiwmc4x4z.zip'\r\n\r\n..\\conda\\envs\\py37\\lib\\zipfile.py:1207: PermissionError\r\n___________________________ Tester.test_video_clips ___________________________\r\n\r\nself = \r\n\r\n def test_video_clips(self):\r\n with get_list_of_videos(num_videos=3) as video_list:\r\n> video_clips = VideoClips(video_list, 5, 5)\r\n\r\ntest\\test_datasets_video_utils.py:62: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\datasets\\video_utils.py:55: in __init__\r\n 
self._compute_frame_pts()\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\datasets\\video_utils.py:84: in _compute_frame_pts\r\n for batch in dl:\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torch\\utils\\data\\dataloader.py:278: in __iter__\r\n return _MultiProcessingDataLoaderIter(self)\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torch\\utils\\data\\dataloader.py:682: in __init__\r\n w.start()\r\n..\\conda\\envs\\py37\\lib\\multiprocessing\\process.py:112: in start\r\n self._popen = self._Popen(self)\r\n..\\conda\\envs\\py37\\lib\\multiprocessing\\context.py:223: in _Popen\r\n return _default_context.get_context().Process._Popen(process_obj)\r\n..\\conda\\envs\\py37\\lib\\multiprocessing\\context.py:322: in _Popen\r\n return Popen(process_obj)\r\n..\\conda\\envs\\py37\\lib\\multiprocessing\\popen_spawn_win32.py:89: in __init__\r\n reduction.dump(process_obj, to_child)\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nobj = , file = <_io.BufferedWriter name=10>\r\nprotocol = None\r\n\r\n def dump(obj, file, protocol=None):\r\n '''Replacement for pickle.dump() using ForkingPickler.'''\r\n> ForkingPickler(file, protocol).dump(obj)\r\nE AttributeError: Can't pickle local object 'VideoClips._compute_frame_pts..DS'\r\n\r\n..\\conda\\envs\\py37\\lib\\multiprocessing\\reduction.py:60: AttributeError\r\n---------------------------- Captured stderr call -----------------------------\r\n\r\n_____________________ Tester.test_video_clips_custom_fps ______________________\r\n\r\nself = \r\n\r\n def test_video_clips_custom_fps(self):\r\n with get_list_of_videos(num_videos=3, sizes=[12, 12, 12], fps=[3, 4, 6]) as video_list:\r\n num_frames = 4\r\n for fps in [1, 3, 4, 10]:\r\n> video_clips = VideoClips(video_list, num_frames, num_frames, fps)\r\n\r\ntest\\test_datasets_video_utils.py:117: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\datasets\\video_utils.py:55: in __init__\r\n self._compute_frame_pts()\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\datasets\\video_utils.py:84: in _compute_frame_pts\r\n for batch in dl:\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torch\\utils\\data\\dataloader.py:278: in __iter__\r\n return _MultiProcessingDataLoaderIter(self)\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torch\\utils\\data\\dataloader.py:682: in __init__\r\n w.start()\r\n..\\conda\\envs\\py37\\lib\\multiprocessing\\process.py:112: in start\r\n self._popen = self._Popen(self)\r\n..\\conda\\envs\\py37\\lib\\multiprocessing\\context.py:223: in _Popen\r\n return _default_context.get_context().Process._Popen(process_obj)\r\n..\\conda\\envs\\py37\\lib\\multiprocessing\\context.py:322: in _Popen\r\n return Popen(process_obj)\r\n..\\conda\\envs\\py37\\lib\\multiprocessing\\popen_spawn_win32.py:89: in __init__\r\n reduction.dump(process_obj, to_child)\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nobj = , file = <_io.BufferedWriter name=10>\r\nprotocol = None\r\n\r\n def dump(obj, file, protocol=None):\r\n '''Replacement for pickle.dump() using ForkingPickler.'''\r\n> ForkingPickler(file, protocol).dump(obj)\r\nE AttributeError: Can't pickle local object 'VideoClips._compute_frame_pts..DS'\r\n\r\n..\\conda\\envs\\py37\\lib\\multiprocessing\\reduction.py:60: AttributeError\r\n---------------------------- Captured stderr call -----------------------------\r\nTraceback (most recent call last):\r\n\r\n File \"\", line 1, in \r\n\r\n File \"c:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\multiprocessing\\spawn.py\", line 105, in spawn_main\r\n\r\n exitcode = _main(fd)\r\n\r\n File \"c:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\multiprocessing\\spawn.py\", line 115, in _main\r\n\r\n self = reduction.pickle.load(from_parent)\r\n\r\nEOFError: Ran out of 
input\r\n\r\n\r\n-------------------------- Captured stderr teardown ---------------------------\r\nTraceback (most recent call last):\r\n\r\n File \"\", line 1, in \r\n\r\n File \"c:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\multiprocessing\\spawn.py\", line 105, in spawn_main\r\n\r\n exitcode = _main(fd)\r\n\r\n File \"c:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\multiprocessing\\spawn.py\", line 115, in _main\r\n\r\n self = reduction.pickle.load(from_parent)\r\n\r\nEOFError: Ran out of input\r\n\r\n_______________________ Tester.test_read_partial_video ________________________\r\n\r\nself = \r\n\r\n def test_read_partial_video(self):\r\n> with temp_video(10, 300, 300, 5, lossless=True) as (f_name, data):\r\n\r\ntest\\test_io.py:84: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n..\\conda\\envs\\py37\\lib\\contextlib.py:112: in __enter__\r\n return next(self.gen)\r\ntest\\test_io.py:51: in temp_video\r\n io.write_video(f.name, data, fps=fps, video_codec=video_codec, options=options)\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\io\\video.py:55: in write_video\r\n container.mux(packet)\r\nav/container/output.pyx:198: in av.container.output.OutputContainer.mux\r\n ???\r\nav/container/output.pyx:204: in av.container.output.OutputContainer.mux_one\r\n ???\r\nav/container/output.pyx:166: in av.container.output.OutputContainer.start_encoding\r\n ???\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\n> ???\r\nE av.AVError: [Errno 13] Permission denied\r\n\r\nav/utils.pyx:109: AVError\r\n___________________ Tester.test_read_partial_video_bframes ____________________\r\n\r\nself = \r\n\r\n def test_read_partial_video_bframes(self):\r\n # do not use lossless encoding, to test the presence of B-frames\r\n options = {'bframes': '16', 'keyint': '10', 'min-keyint': '4'}\r\n> with temp_video(100, 300, 300, 5, options=options) as (f_name, 
data):\r\n\r\ntest\\test_io.py:100: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n..\\conda\\envs\\py37\\lib\\contextlib.py:112: in __enter__\r\n return next(self.gen)\r\ntest\\test_io.py:51: in temp_video\r\n io.write_video(f.name, data, fps=fps, video_codec=video_codec, options=options)\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\io\\video.py:55: in write_video\r\n container.mux(packet)\r\nav/container/output.pyx:198: in av.container.output.OutputContainer.mux\r\n ???\r\nav/container/output.pyx:204: in av.container.output.OutputContainer.mux_one\r\n ???\r\nav/container/output.pyx:166: in av.container.output.OutputContainer.start_encoding\r\n ???\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\n> ???\r\nE av.AVError: [Errno 13] Permission denied\r\n\r\nav/utils.pyx:109: AVError\r\n_________________________ Tester.test_read_timestamps _________________________\r\n\r\nself = \r\n\r\n def test_read_timestamps(self):\r\n> with temp_video(10, 300, 300, 5) as (f_name, data):\r\n\r\ntest\\test_io.py:69: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n..\\conda\\envs\\py37\\lib\\contextlib.py:112: in __enter__\r\n return next(self.gen)\r\ntest\\test_io.py:51: in temp_video\r\n io.write_video(f.name, data, fps=fps, video_codec=video_codec, options=options)\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\io\\video.py:59: in write_video\r\n container.mux(packet)\r\nav/container/output.pyx:198: in av.container.output.OutputContainer.mux\r\n ???\r\nav/container/output.pyx:204: in av.container.output.OutputContainer.mux_one\r\n ???\r\nav/container/output.pyx:166: in av.container.output.OutputContainer.start_encoding\r\n ???\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\n> ???\r\nE av.AVError: [Errno 13] Permission denied\r\n\r\nav/utils.pyx:109: AVError\r\n___________________ 
Tester.test_read_timestamps_from_packet ___________________\r\n\r\nself = \r\n\r\n def test_read_timestamps_from_packet(self):\r\n> with temp_video(10, 300, 300, 5, video_codec='mpeg4') as (f_name, data):\r\n\r\ntest\\test_io.py:129: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n..\\conda\\envs\\py37\\lib\\contextlib.py:112: in __enter__\r\n return next(self.gen)\r\ntest\\test_io.py:51: in temp_video\r\n io.write_video(f.name, data, fps=fps, video_codec=video_codec, options=options)\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\io\\video.py:55: in write_video\r\n container.mux(packet)\r\nav/container/output.pyx:198: in av.container.output.OutputContainer.mux\r\n ???\r\nav/container/output.pyx:204: in av.container.output.OutputContainer.mux_one\r\n ???\r\nav/container/output.pyx:166: in av.container.output.OutputContainer.start_encoding\r\n ???\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\n> ???\r\nE av.AVError: [Errno 13] Permission denied\r\n\r\nav/utils.pyx:109: AVError\r\n________________________ Tester.test_write_read_video _________________________\r\n\r\nself = \r\n\r\n def test_write_read_video(self):\r\n> with temp_video(10, 300, 300, 5, lossless=True) as (f_name, data):\r\n\r\ntest\\test_io.py:62: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n..\\conda\\envs\\py37\\lib\\contextlib.py:112: in __enter__\r\n return next(self.gen)\r\ntest\\test_io.py:51: in temp_video\r\n io.write_video(f.name, data, fps=fps, video_codec=video_codec, options=options)\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\io\\video.py:55: in write_video\r\n container.mux(packet)\r\nav/container/output.pyx:198: in av.container.output.OutputContainer.mux\r\n ???\r\nav/container/output.pyx:204: in av.container.output.OutputContainer.mux_one\r\n ???\r\nav/container/output.pyx:166: in av.container.output.OutputContainer.start_encoding\r\n 
???\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\n> ???\r\nE av.AVError: [Errno 13] Permission denied\r\n\r\nav/utils.pyx:109: AVError\r\n___________________________ Tester.test_save_image ____________________________\r\n\r\nself = \r\n\r\n def test_save_image(self):\r\n with tempfile.NamedTemporaryFile(suffix='.png') as f:\r\n t = torch.rand(2, 3, 64, 64)\r\n> utils.save_image(t, f.name)\r\n\r\ntest\\test_utils.py:43: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n..\\conda\\envs\\py37\\lib\\site-packages\\torchvision\\utils.py:105: in save_image\r\n im.save(filename)\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nself = \r\nfp = 'C:\\\\Users\\\\ADMINI~1\\\\AppData\\\\Local\\\\Temp\\\\tmpm0s9rq8o.png'\r\nformat = 'PNG', params = {}\r\nfilename = 'C:\\\\Users\\\\ADMINI~1\\\\AppData\\\\Local\\\\Temp\\\\tmpm0s9rq8o.png'\r\nopen_fp = True, save_all = False, ext = '.png'\r\nsave_handler = \r\n\r\n def save(self, fp, format=None, **params):\r\n \"\"\"\r\n Saves this image under the given filename. If no format is\r\n specified, the format to use is determined from the filename\r\n extension, if possible.\r\n \r\n Keyword options can be used to provide additional instructions\r\n to the writer. If a writer doesn't recognise an option, it is\r\n silently ignored. The available options are described in the\r\n :doc:`image format documentation\r\n <../handbook/image-file-formats>` for each writer.\r\n \r\n You can use a file object instead of a filename. In this case,\r\n you must always specify the format. The file object must\r\n implement the ``seek``, ``tell``, and ``write``\r\n methods, and be opened in binary mode.\r\n \r\n :param fp: A filename (string), pathlib.Path object or file object.\r\n :param format: Optional format override. 
If omitted, the
             format to use is determined from the filename extension.
             If a file object was used instead of a filename, this
             parameter should always be used.
          :param params: Extra parameters to the image writer.
          :returns: None
          :exception ValueError: If the output format could not be determined
             from the file name. Use the format option to solve this.
          :exception IOError: If the file could not be written. The file
             may have been created, and may contain partial data.
          """

          filename = ""
          open_fp = False
          if isPath(fp):
              filename = fp
              open_fp = True
          elif HAS_PATHLIB and isinstance(fp, Path):
              filename = str(fp)
              open_fp = True
          if not filename and hasattr(fp, "name") and isPath(fp.name):
              # only set the name for metadata purposes
              filename = fp.name

          # may mutate self!
          self._ensure_mutable()

          save_all = params.pop("save_all", False)
          self.encoderinfo = params
          self.encoderconfig = ()

          preinit()

          ext = os.path.splitext(filename)[1].lower()

          if not format:
              if ext not in EXTENSION:
                  init()
              try:
                  format = EXTENSION[ext]
              except KeyError:
                  raise ValueError("unknown file extension: {}".format(ext))

          if format.upper() not in SAVE:
              init()
          if save_all:
              save_handler = SAVE_ALL[format.upper()]
          else:
              save_handler = SAVE[format.upper()]

          if open_fp:
              if params.get("append", False):
                  fp = builtins.open(filename, "r+b")
              else:
                  # Open also for reading ("+"), because TIFF save_all
                  # writer needs to go back and edit the written data.
>                 fp = builtins.open(filename, "w+b")
E                 PermissionError: [Errno 13] Permission denied: 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmpm0s9rq8o.png'

..\conda\envs\py37\lib\site-packages\PIL\Image.py:2085: PermissionError
_____________________ Tester.test_save_image_single_pixel _____________________

self =

    def test_save_image_single_pixel(self):
        with tempfile.NamedTemporaryFile(suffix='.png') as f:
            t = torch.rand(1, 3, 1, 1)
>           utils.save_image(t, f.name)

test\test_utils.py:49:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
..\conda\envs\py37\lib\site-packages\torchvision\utils.py:105: in save_image
    im.save(filename)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
fp = 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmpn5hd8b_0.png'
format = 'PNG', params = {}
filename = 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmpn5hd8b_0.png'
open_fp = True, save_all = False, ext = '.png'
save_handler =

    def save(self, fp, format=None, **params):
        """
        Saves this image under the given filename. If no format is
        specified, the format to use is determined from the filename
        extension, if possible.

        Keyword options can be used to provide additional instructions
        to the writer. If a writer doesn't recognise an option, it is
        silently ignored. The available options are described in the
        :doc:`image format documentation
        <../handbook/image-file-formats>` for each writer.

        You can use a file object instead of a filename. In this case,
        you must always specify the format. The file object must
        implement the ``seek``, ``tell``, and ``write``
        methods, and be opened in binary mode.

        :param fp: A filename (string), pathlib.Path object or file object.
        :param format: Optional format override. If omitted, the
             format to use is determined from the filename extension.
             If a file object was used instead of a filename, this
             parameter should always be used.
          :param params: Extra parameters to the image writer.
          :returns: None
          :exception ValueError: If the output format could not be determined
             from the file name. Use the format option to solve this.
          :exception IOError: If the file could not be written. The file
             may have been created, and may contain partial data.
          """

          filename = ""
          open_fp = False
          if isPath(fp):
              filename = fp
              open_fp = True
          elif HAS_PATHLIB and isinstance(fp, Path):
              filename = str(fp)
              open_fp = True
          if not filename and hasattr(fp, "name") and isPath(fp.name):
              # only set the name for metadata purposes
              filename = fp.name

          # may mutate self!
          self._ensure_mutable()

          save_all = params.pop("save_all", False)
          self.encoderinfo = params
          self.encoderconfig = ()

          preinit()

          ext = os.path.splitext(filename)[1].lower()

          if not format:
              if ext not in EXTENSION:
                  init()
              try:
                  format = EXTENSION[ext]
              except KeyError:
                  raise ValueError("unknown file extension: {}".format(ext))

          if format.upper() not in SAVE:
              init()
          if save_all:
              save_handler = SAVE_ALL[format.upper()]
          else:
              save_handler = SAVE[format.upper()]

          if open_fp:
              if params.get("append", False):
                  fp = builtins.open(filename, "r+b")
              else:
                  # Open also for reading ("+"), because TIFF save_all
                  # writer needs to go back and edit the written data.
>                 fp = builtins.open(filename, "w+b")
E                 PermissionError: [Errno 13] Permission denied: 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmpn5hd8b_0.png'

..\conda\envs\py37\lib\site-packages\PIL\Image.py:2085: PermissionError
============================== warnings summary ===============================
c:\w\2\s\packaging\windows\conda\envs\py37\lib\site-packages\torchvision\datasets\lsun.py:8
  c:\w\2\s\packaging\windows\conda\envs\py37\lib\site-packages\torchvision\datasets\lsun.py:8: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working
    from collections import Iterable

c:\w\2\s\packaging\windows\conda\envs\py37\lib\site-packages\av\container\__init__.py:1
  c:\w\2\s\packaging\windows\conda\envs\py37\lib\site-packages\av\container\__init__.py:1: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working
    from .core import Container, open

test/test_datasets.py::Tester::test_imagenet
  c:\w\2\s\packaging\windows\conda\envs\py37\lib\importlib\_bootstrap.py:219: RuntimeWarning: numpy.ufunc size changed, may indicate binary incompatibility. Expected 192 from C header, got 216 from PyObject
    return f(*args, **kwds)

test/test_transforms.py::Tester::test_randomperspective
  c:\w\2\s\packaging\windows\conda\envs\py37\lib\site-packages\torchvision\transforms\functional.py:440: UserWarning: torch.gels is deprecated in favour of torch.lstsq and will be removed in the next release. Please use torch.lstsq instead.
    res = torch.gels(B, A)[0]

-- Docs: https://docs.pytest.org/en/latest/warnings.html
======= 32 failed, 141 passed, 14 skipped, 4 warnings in 407.68 seconds =======
```
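For what it's worth, the `PermissionError` in the `save_image` tests looks like the usual Windows `NamedTemporaryFile` pitfall: the context manager keeps its own handle open, and on Windows a file created that way cannot be reopened by name, which is exactly what PIL's `Image.save()` tries to do with `f.name`. A minimal stdlib-only sketch of a pattern that avoids this (using `mkstemp` plus an explicit `os.close` is my suggestion here, not what the torchvision tests currently do):

```python
import os
import tempfile

# tempfile.NamedTemporaryFile holds an open handle, and on Windows the
# file then cannot be reopened by name while that handle exists.
# mkstemp + os.close releases the handle first, so a writer such as
# PIL's Image.save() can reopen the path on any platform.
fd, path = tempfile.mkstemp(suffix=".png")
os.close(fd)  # release the handle before the writer reopens the path
try:
    # stands in for Image.save() / torchvision.utils.save_image()
    with open(path, "w+b") as f:
        f.write(b"fake png bytes")
    size = os.path.getsize(path)
finally:
    os.remove(path)  # manual cleanup replaces delete-on-close
```

The same file-sharing semantics would also explain the `shutil.rmtree` teardown failures, where a file is still held open while the temp directory is being removed.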

Several tests fail on Windows with 0.4.0

\n\n

Test log:\n```\n============================= test session starts =============================\nplatform win32 -- Python 3.7.4, pytest-5.0.1, py-1.8.0, pluggy-0.12.0\nrootdir: C:\\w\\2\\s\\packaging\\windows\\vision\ncollected 187 items

\n\n

test\\testbackboneutils.py .. [ 1%]\ntest\\testcppmodels.py FFFFF..FFFF........FFFFFFFFFF.. [ 17%]\ntest\\testdatasets.py ..F...... [ 22%]\ntest\\testdatasetstransforms.py .. [ 23%]\ntest\\testdatasetsutils.py .....FFF. [ 28%]\ntest\\testdatasetsvideoutils.py ..FFss [ 31%]\ntest\\testio.py .FFFFF [ 34%]\ntest\\testmodels.py ................................................ [ 60%]\ntest\\testops.py ..s..s.s.s.s.s.s.s.s [ 71%]\ntest\\testtransforms.py ..........sss................................... [ 96%]\n.. [ 97%]\ntest\\test_utils.py ..FF [100%]

\n\n

================================== FAILURES ===================================\n________ Tester.testalexnet _________

\n\n

self =

\n\n
def test_alexnet(self):\n
\n\n
\n
  process_model(models.alexnet(self.pretrained), self.image, _C_tests.forward_alexnet, 'Alexnet')\n
\n
\n\n

test\\testcppmodels.py:43:

\n\n
\n\n

model = AlexNet(\n (features): Sequential(\n (0): Conv2d(3, 64, kernelsize=(11, 11), stride=(4, 4), padding=(2, 2))\n (1)...ures=4096, bias=True)\n (5): ReLU(inplace=True)\n (6): Linear(infeatures=4096, out_features=1000, bias=True)\n )\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'Alexnet'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n________ Tester.testdensenet121 _________

\n\n

self =

\n\n
def test_densenet121(self):\n
\n\n
\n
  process_model(models.densenet121(self.pretrained), self.image, _C_tests.forward_densenet121, 'Densenet121')\n
\n
\n\n

test\\testcppmodels.py:105:

\n\n
\n\n

model = DenseNet(\n (features): Sequential(\n (conv0): Conv2d(3, 64, kernelsize=(7, 7), stride=(2, 2), padding=(3, 3), bias....1, affine=True, trackrunningstats=True)\n )\n (classifier): Linear(infeatures=1024, out_features=1000, bias=True)\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'Densenet121'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n________ Tester.testdensenet161 _________

\n\n

self =

\n\n
def test_densenet161(self):\n
\n\n
\n
  process_model(models.densenet161(self.pretrained), self.image, _C_tests.forward_densenet161, 'Densenet161')\n
\n
\n\n

test\\testcppmodels.py:114:

\n\n
\n\n

model = DenseNet(\n (features): Sequential(\n (conv0): Conv2d(3, 96, kernelsize=(7, 7), stride=(2, 2), padding=(3, 3), bias....1, affine=True, trackrunningstats=True)\n )\n (classifier): Linear(infeatures=2208, out_features=1000, bias=True)\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'Densenet161'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n________ Tester.testdensenet169 _________

\n\n

self =

\n\n
def test_densenet169(self):\n
\n\n
\n
  process_model(models.densenet169(self.pretrained), self.image, _C_tests.forward_densenet169, 'Densenet169')\n
\n
\n\n

test\\testcppmodels.py:108:

\n\n
\n\n

model = DenseNet(\n (features): Sequential(\n (conv0): Conv2d(3, 64, kernelsize=(7, 7), stride=(2, 2), padding=(3, 3), bias....1, affine=True, trackrunningstats=True)\n )\n (classifier): Linear(infeatures=1664, out_features=1000, bias=True)\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'Densenet169'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n________ Tester.testdensenet201 _________

\n\n

self =

\n\n
def test_densenet201(self):\n
\n\n
\n
  process_model(models.densenet201(self.pretrained), self.image, _C_tests.forward_densenet201, 'Densenet201')\n
\n
\n\n

test\\testcppmodels.py:111:

\n\n
\n\n

model = DenseNet(\n (features): Sequential(\n (conv0): Conv2d(3, 64, kernelsize=(7, 7), stride=(2, 2), padding=(3, 3), bias....1, affine=True, trackrunningstats=True)\n )\n (classifier): Linear(infeatures=1920, out_features=1000, bias=True)\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'Densenet201'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n________ Tester.testmnasnet05 _________

\n\n

self =

\n\n
def test_mnasnet0_5(self):\n
\n\n
\n
  process_model(models.mnasnet0_5(self.pretrained), self.image, _C_tests.forward_mnasnet0_5, 'MNASNet0_5')\n
\n
\n\n

test\\testcppmodels.py:123:

\n\n
\n\n

model = MNASNet(\n (layers): Sequential(\n (0): Conv2d(3, 32, kernelsize=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)...Sequential(\n (0): Dropout(p=0.2, inplace=True)\n (1): Linear(infeatures=1280, outfeatures=1000, bias=True)\n )\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'MNASNet05'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n________ Tester.testmnasnet075 ________

\n\n

self =

\n\n
def test_mnasnet0_75(self):\n
\n\n
\n
  process_model(models.mnasnet0_75(self.pretrained), self.image, _C_tests.forward_mnasnet0_75, 'MNASNet0_75')\n
\n
\n\n

test\\testcppmodels.py:126:

\n\n
\n\n

model = MNASNet(\n (layers): Sequential(\n (0): Conv2d(3, 32, kernelsize=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)...Sequential(\n (0): Dropout(p=0.2, inplace=True)\n (1): Linear(infeatures=1280, outfeatures=1000, bias=True)\n )\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'MNASNet075'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n________ Tester.testmnasnet10 _________

\n\n

self =

\n\n
def test_mnasnet1_0(self):\n
\n\n
\n
  process_model(models.mnasnet1_0(self.pretrained), self.image, _C_tests.forward_mnasnet1_0, 'MNASNet1_0')\n
\n
\n\n

test\\testcppmodels.py:129:

\n\n
\n\n

model = MNASNet(\n (layers): Sequential(\n (0): Conv2d(3, 32, kernelsize=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)...Sequential(\n (0): Dropout(p=0.2, inplace=True)\n (1): Linear(infeatures=1280, outfeatures=1000, bias=True)\n )\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'MNASNet10'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n________ Tester.testmnasnet13 _________

\n\n

self =

\n\n
def test_mnasnet1_3(self):\n
\n\n
\n
  process_model(models.mnasnet1_3(self.pretrained), self.image, _C_tests.forward_mnasnet1_3, 'MNASNet1_3')\n
\n
\n\n

test\\testcppmodels.py:132:

\n\n
\n\n

model = MNASNet(\n (layers): Sequential(\n (0): Conv2d(3, 32, kernelsize=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)...Sequential(\n (0): Dropout(p=0.2, inplace=True)\n (1): Linear(infeatures=1280, outfeatures=1000, bias=True)\n )\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'MNASNet13'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n_______ Tester.testsqueezenet10 _______

\n\n

self =

\n\n
def test_squeezenet1_0(self):\n    process_model(models.squeezenet1_0(self.pretrained), self.image,\n
\n\n
\n
                _C_tests.forward_squeezenet1_0, 'Squeezenet1.0')\n
\n
\n\n

test\\testcppmodels.py:98:

\n\n
\n\n

model = SqueezeNet(\n (features): Sequential(\n (0): Conv2d(3, 96, kernelsize=(7, 7), stride=(2, 2))\n (1): ReLU(inplace=...00, kernelsize=(1, 1), stride=(1, 1))\n (2): ReLU(inplace=True)\n (3): AdaptiveAvgPool2d(output_size=(1, 1))\n )\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'Squeezenet1.0'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n_______ Tester.testsqueezenet11 _______

\n\n

self =

\n\n
def test_squeezenet1_1(self):\n    process_model(models.squeezenet1_1(self.pretrained), self.image,\n
\n\n
\n
                _C_tests.forward_squeezenet1_1, 'Squeezenet1.1')\n
\n
\n\n

test\\testcppmodels.py:102:

\n\n
\n\n

model = SqueezeNet(\n (features): Sequential(\n (0): Conv2d(3, 64, kernelsize=(3, 3), stride=(2, 2))\n (1): ReLU(inplace=...00, kernelsize=(1, 1), stride=(1, 1))\n (2): ReLU(inplace=True)\n (3): AdaptiveAvgPool2d(output_size=(1, 1))\n )\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'Squeezenet1.1'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n_________ Tester.testvgg11 __________

\n\n

self =

\n\n
def test_vgg11(self):\n
\n\n
\n
  process_model(models.vgg11(self.pretrained), self.image, _C_tests.forward_vgg11, 'VGG11')\n
\n
\n\n

test\\testcppmodels.py:46:

\n\n
\n\n

model = VGG(\n (features): Sequential(\n (0): Conv2d(3, 64, kernelsize=(3, 3), stride=(1, 1), padding=(1, 1))\n (1): ReLU...lace=True)\n (5): Dropout(p=0.5, inplace=False)\n (6): Linear(infeatures=4096, out_features=1000, bias=True)\n )\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'VGG11'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n________ Tester.testvgg11bn _________

\n\n

self =

\n\n
def test_vgg11_bn(self):\n
\n\n
\n
  process_model(models.vgg11_bn(self.pretrained), self.image, _C_tests.forward_vgg11bn, 'VGG11BN')\n
\n
\n\n

test\\testcppmodels.py:58:

\n\n
\n\n

model = VGG(\n (features): Sequential(\n (0): Conv2d(3, 64, kernelsize=(3, 3), stride=(1, 1), padding=(1, 1))\n (1): Batc...lace=True)\n (5): Dropout(p=0.5, inplace=False)\n (6): Linear(infeatures=4096, out_features=1000, bias=True)\n )\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'VGG11BN'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\2\\s\\packaging\\windows\\conda\\envs\\py37\\lib\\site-packages\\torch\\include\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n_________ Tester.testvgg13 __________

\n\n

self =

\n\n
def test_vgg13(self):\n
\n\n
\n
  process_model(models.vgg13(self.pretrained), self.image, _C_tests.forward_vgg13, 'VGG13')\n
\n
\n\n

test\\testcppmodels.py:49:

\n\n
\n\n

model = VGG(\n (features): Sequential(\n (0): Conv2d(3, 64, kernelsize=(3, 3), stride=(1, 1), padding=(1, 1))\n (1): ReLU...lace=True)\n (5): Dropout(p=0.5, inplace=False)\n (6): Linear(infeatures=4096, out_features=1000, bias=True)\n )\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'VGG13'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n________ Tester.testvgg13bn _________

\n\n

self =

\n\n
def test_vgg13_bn(self):\n
\n\n
\n
  process_model(models.vgg13_bn(self.pretrained), self.image, _C_tests.forward_vgg13bn, 'VGG13BN')\n
\n
\n\n

test\\testcppmodels.py:61:

\n\n
\n\n

model = VGG(\n (features): Sequential(\n (0): Conv2d(3, 64, kernelsize=(3, 3), stride=(1, 1), padding=(1, 1))\n (1): Batc...lace=True)\n (5): Dropout(p=0.5, inplace=False)\n (6): Linear(infeatures=4096, out_features=1000, bias=True)\n )\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'VGG13BN'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n_________ Tester.testvgg16 __________

\n\n

self =

\n\n
def test_vgg16(self):\n
\n\n
\n
  process_model(models.vgg16(self.pretrained), self.image, _C_tests.forward_vgg16, 'VGG16')\n
\n
\n\n

test\\testcppmodels.py:52:

\n\n
\n\n

model = VGG(\n (features): Sequential(\n (0): Conv2d(3, 64, kernelsize=(3, 3), stride=(1, 1), padding=(1, 1))\n (1): ReLU...lace=True)\n (5): Dropout(p=0.5, inplace=False)\n (6): Linear(infeatures=4096, out_features=1000, bias=True)\n )\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'VGG16'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n________ Tester.testvgg16bn _________

\n\n

self =

\n\n
def test_vgg16_bn(self):\n
\n\n
\n
  process_model(models.vgg16_bn(self.pretrained), self.image, _C_tests.forward_vgg16bn, 'VGG16BN')\n
\n
\n\n

test\\testcppmodels.py:64:

\n\n
\n\n

model = VGG(\n (features): Sequential(\n (0): Conv2d(3, 64, kernelsize=(3, 3), stride=(1, 1), padding=(1, 1))\n (1): Batc...lace=True)\n (5): Dropout(p=0.5, inplace=False)\n (6): Linear(infeatures=4096, out_features=1000, bias=True)\n )\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'VGG16BN'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n_________ Tester.testvgg19 __________

\n\n

self =

\n\n
def test_vgg19(self):\n
\n\n
\n
  process_model(models.vgg19(self.pretrained), self.image, _C_tests.forward_vgg19, 'VGG19')\n
\n
\n\n

test\\testcppmodels.py:55:

\n\n
\n\n

model = VGG(\n (features): Sequential(\n (0): Conv2d(3, 64, kernelsize=(3, 3), stride=(1, 1), padding=(1, 1))\n (1): ReLU...lace=True)\n (5): Dropout(p=0.5, inplace=False)\n (6): Linear(infeatures=4096, out_features=1000, bias=True)\n )\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'VGG19'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n________ Tester.testvgg19bn _________

\n\n

self =

\n\n
def test_vgg19_bn(self):\n
\n\n
\n
  process_model(models.vgg19_bn(self.pretrained), self.image, _C_tests.forward_vgg19bn, 'VGG19BN')\n
\n
\n\n

test\\testcppmodels.py:67:

\n\n
\n\n

model = VGG(\n (features): Sequential(\n (0): Conv2d(3, 64, kernelsize=(3, 3), stride=(1, 1), padding=(1, 1))\n (1): Batc...lace=True)\n (5): Dropout(p=0.5, inplace=False)\n (6): Linear(infeatures=4096, out_features=1000, bias=True)\n )\n)\ntensor = tensor([[[[0.0902, 0.1098, 0.1216, ..., 0.2824, 0.2314, 0.2392],\n [0.0980, 0.0863, 0.1020, ..., 0.3333, 0.3...20, 0.1059, 0.0980, ..., 0.0667, 0.0784, 0.0706],\n [0.1059, 0.0941, 0.0980, ..., 0.0588, 0.0667, 0.0667]]]])\nfunc = \nname = 'VGG19BN'

\n\n
def process_model(model, tensor, func, name):\n    model.eval()\n    traced_script_module = torch.jit.trace(model, tensor)\n    traced_script_module.save(\"model.pt\")\n\n    py_output = model.forward(tensor)\n
\n\n
\n
  cpp_output = func(\"model.pt\", tensor)\n
\n \n

E RuntimeError: undefined Tensor (inferisvariable at C:\\w\\1\\s\\windows\\pytorch\\build\\aten\\src\\ATen/Functions.h:1149)\n E (no backtrace available)

\n
\n\n

test\\testcppmodels.py:16: RuntimeError\n________ Tester.testcityscapes ________

\n\n

self =

\n\n
def test_cityscapes(self):\n    with cityscapes_root() as root:\n\n        for mode in ['coarse', 'fine']:\n\n            if mode == 'coarse':\n                splits = ['train', 'train_extra', 'val']\n            else:\n                splits = ['train', 'val', 'test']\n\n            for split in splits:\n                for target_type in ['semantic', 'instance']:\n                    dataset = torchvision.datasets.Cityscapes(root, split=split,\n                                                              target_type=target_type, mode=mode)\n                    self.generic_segmentation_dataset_test(dataset, num_images=2)\n\n                color_dataset = torchvision.datasets.Cityscapes(root, split=split,\n                                                                target_type='color', mode=mode)\n                color_img, color_target = color_dataset[0]\n                self.assertTrue(isinstance(color_img, PIL.Image.Image))\n                self.assertTrue(np.array(color_target).shape[2] == 4)\n\n                polygon_dataset = torchvision.datasets.Cityscapes(root, split=split,\n                                                                  target_type='polygon', mode=mode)\n                polygon_img, polygon_target = polygon_dataset[0]\n                self.assertTrue(isinstance(polygon_img, PIL.Image.Image))\n                self.assertTrue(isinstance(polygon_target, dict))\n                self.assertTrue(isinstance(polygon_target['imgHeight'], int))\n                self.assertTrue(isinstance(polygon_target['objects'], list))\n\n                # Test multiple target types\n                targets_combo = ['semantic', 'polygon', 'color']\n                multiple_types_dataset = torchvision.datasets.Cityscapes(root, split=split,\n                                                                         target_type=targets_combo,\n                                                                         mode=mode)\n                output = 
multiple_types_dataset[0]\n                self.assertTrue(isinstance(output, tuple))\n                self.assertTrue(len(output) == 2)\n                self.assertTrue(isinstance(output[0], PIL.Image.Image))\n                self.assertTrue(isinstance(output[1], tuple))\n                self.assertTrue(len(output[1]) == 3)\n                self.assertTrue(isinstance(output[1][0], PIL.Image.Image))  # semantic\n                self.assertTrue(isinstance(output[1][1], dict))  # polygon\n
>               self.assertTrue(isinstance(output[1][2], PIL.Image.Image))  # color

test\test_datasets.py:195:

..\conda\envs\py37\lib\contextlib.py:119: in __exit__
    next(self.gen)
test\fake_data_generation.py:243: in cityscapes_root
    yield tmp_dir
..\conda\envs\py37\lib\contextlib.py:119: in __exit__
    next(self.gen)
test\common_utils.py:16: in get_tmp_dir
    shutil.rmtree(tmp_dir)
..\conda\envs\py37\lib\shutil.py:516: in rmtree
    return _rmtree_unsafe(path, onerror)
..\conda\envs\py37\lib\shutil.py:395: in _rmtree_unsafe
    _rmtree_unsafe(fullname, onerror)
..\conda\envs\py37\lib\shutil.py:395: in _rmtree_unsafe
    _rmtree_unsafe(fullname, onerror)
..\conda\envs\py37\lib\shutil.py:395: in _rmtree_unsafe
    _rmtree_unsafe(fullname, onerror)
..\conda\envs\py37\lib\shutil.py:400: in _rmtree_unsafe
    onerror(os.unlink, fullname, sys.exc_info())

path = 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmp5etnebcf\\gtFine\\test\\bochum'
onerror = <function rmtree.<locals>.onerror at 0x000000323F3BFDC8>

    def _rmtree_unsafe(path, onerror):
        try:
            with os.scandir(path) as scandir_it:
                entries = list(scandir_it)
        except OSError:
            onerror(os.scandir, path, sys.exc_info())
            entries = []
        for entry in entries:
            fullname = entry.path
            try:
                is_dir = entry.is_dir(follow_symlinks=False)
            except OSError:
                is_dir = False
            if is_dir:
                try:
                    if entry.is_symlink():
                        # This can only happen if someone replaces
                        # a directory with a symlink after the call to
                        # os.scandir or entry.is_dir above.
                        raise OSError("Cannot call rmtree on a symbolic link")
                except OSError:
                    onerror(os.path.islink, fullname, sys.exc_info())
                    continue
                _rmtree_unsafe(fullname, onerror)
            else:
                try:
>                   os.unlink(fullname)

E                   PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmp5etnebcf\\gtFine\\test\\bochum\\bochum_000000_000000_gtFine_color.png'

..\conda\envs\py37\lib\shutil.py:398: PermissionError
________________________ Tester.test_extract_gzip _________________________

self =

    def test_extract_gzip(self):
        with get_tmp_dir() as temp_dir:
            with tempfile.NamedTemporaryFile(suffix='.gz') as f:
>               with gzip.GzipFile(f.name, 'wb') as zf:

test\test_datasets_utils.py:101:

self = <[AttributeError("'GzipFile' object has no attribute 'fileobj'") raised in repr()] GzipFile object at 0x32007d28c8>
filename = 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmpc1wq6shu.gz'
mode = 'wb', compresslevel = 9, fileobj = None, mtime = None

    def __init__(self, filename=None, mode=None,
                 compresslevel=9, fileobj=None, mtime=None):
        """Constructor for the GzipFile class.

        At least one of fileobj and filename must be given a
        non-trivial value.

        The new class instance is based on fileobj, which can be a regular
        file, an io.BytesIO object, or any other object which simulates a file.
        It defaults to None, in which case filename is opened to provide
        a file object.

        When fileobj is not None, the filename argument is only used to be
        included in the gzip file header, which may include the original
        filename of the uncompressed file.  It defaults to the filename of
        fileobj, if discernible; otherwise, it defaults to the empty string,
        and in this case the original filename is not included in the header.

        The mode argument can be any of 'r', 'rb', 'a', 'ab', 'w', 'wb', 'x', or
        'xb' depending on whether the file will be read or written.  The default
        is the mode of fileobj if discernible; otherwise, the default is 'rb'.
        A mode of 'r' is equivalent to one of 'rb', and similarly for 'w' and
        'wb', 'a' and 'ab', and 'x' and 'xb'.

        The compresslevel argument is an integer from 0 to 9 controlling the
        level of compression; 1 is fastest and produces the least compression,
        and 9 is slowest and produces the most compression. 0 is no compression
        at all. The default is 9.

        The mtime argument is an optional numeric timestamp to be written
        to the last modification time field in the stream when compressing.
        If omitted or None, the current time is used.

        """

        if mode and ('t' in mode or 'U' in mode):
            raise ValueError("Invalid mode: {!r}".format(mode))
        if mode and 'b' not in mode:
            mode += 'b'
        if fileobj is None:
>           fileobj = self.myfileobj = builtins.open(filename, mode or 'rb')

E           PermissionError: [Errno 13] Permission denied: 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmpc1wq6shu.gz'

..\conda\envs\py37\lib\gzip.py:163: PermissionError
_________________________ Tester.test_extract_tar __________________________

self =

    def test_extract_tar(self):
        for ext, mode in zip(['.tar', '.tar.gz'], ['w', 'w:gz']):
            with get_tmp_dir() as temp_dir:
                with tempfile.NamedTemporaryFile() as bf:
                    bf.write("this is the content".encode())
                    bf.seek(0)
                    with tempfile.NamedTemporaryFile(suffix=ext) as f:
>                       with tarfile.open(f.name, mode=mode) as zf:

test\test_datasets_utils.py:90:

..\conda\envs\py37\lib\tarfile.py:1611: in open
    return cls.taropen(name, mode, fileobj, **kwargs)
..\conda\envs\py37\lib\tarfile.py:1621: in taropen
    return cls(name, mode, fileobj, **kwargs)

self =
name = 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmplby3znrd.tar', mode = 'w'
fileobj = None, format = None, tarinfo = None, dereference = None
ignore_zeros = None, encoding = None, errors = 'surrogateescape'
pax_headers = None, debug = None, errorlevel = None, copybufsize = None

    def __init__(self, name=None, mode="r", fileobj=None, format=None,
            tarinfo=None, dereference=None, ignore_zeros=None, encoding=None,
            errors="surrogateescape", pax_headers=None, debug=None,
            errorlevel=None, copybufsize=None):
        """Open an (uncompressed) tar archive `name'. `mode' is either 'r' to
           read from an existing archive, 'a' to append data to an existing
           file or 'w' to create a new file overwriting an existing one. `mode'
           defaults to 'r'.
           If `fileobj' is given, it is used for reading or writing data. If it
           can be determined, `mode' is overridden by `fileobj's mode.
           `fileobj' is not closed, when TarFile is closed.
        """
        modes = {"r": "rb", "a": "r+b", "w": "wb", "x": "xb"}
        if mode not in modes:
            raise ValueError("mode must be 'r', 'a', 'w' or 'x'")
        self.mode = mode
        self._mode = modes[mode]

        if not fileobj:
            if self.mode == "a" and not os.path.exists(name):
                # Create nonexistent files in append mode.
                self.mode = "w"
                self._mode = "wb"
>           fileobj = bltn_open(name, self._mode)

E           PermissionError: [Errno 13] Permission denied: 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmplby3znrd.tar'

..\conda\envs\py37\lib\tarfile.py:1436: PermissionError
_________________________ Tester.test_extract_zip __________________________

self =

    def test_extract_zip(self):
        with get_tmp_dir() as temp_dir:
            with tempfile.NamedTemporaryFile(suffix='.zip') as f:
                with zipfile.ZipFile(f, 'w') as zf:
                    zf.writestr('file.tst', 'this is the content')
>               utils.extract_archive(f.name, temp_dir)

test\test_datasets_utils.py:77:

..\conda\envs\py37\lib\site-packages\torchvision\datasets\utils.py:231: in extract_archive
    with zipfile.ZipFile(from_path, 'r') as z:

self =

    def __init__(self, file, mode="r", compression=ZIP_STORED, allowZip64=True,
                 compresslevel=None):
        """Open the ZIP file with mode read 'r', write 'w', exclusive create 'x',
        or append 'a'."""
        if mode not in ('r', 'w', 'x', 'a'):
            raise ValueError("ZipFile requires mode 'r', 'w', 'x', or 'a'")

        _check_compression(compression)

        self._allowZip64 = allowZip64
        self._didModify = False
        self.debug = 0  # Level of printing: 0 through 3
        self.NameToInfo = {}    # Find file info given name
        self.filelist = []      # List of ZipInfo instances for archive
        self.compression = compression  # Method of compression
        self.compresslevel = compresslevel
        self.mode = mode
        self.pwd = None
        self._comment = b''

        # Check if we were passed a file-like object
        if isinstance(file, os.PathLike):
            file = os.fspath(file)
        if isinstance(file, str):
            # No, it's a filename
            self._filePassed = 0
            self.filename = file
            modeDict = {'r' : 'rb', 'w': 'w+b', 'x': 'x+b', 'a' : 'r+b',
                        'r+b': 'w+b', 'w+b': 'wb', 'x+b': 'xb'}
            filemode = modeDict[mode]
            while True:
                try:
>                   self.fp = io.open(file, filemode)

E                   PermissionError: [Errno 13] Permission denied: 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmpiwmc4x4z.zip'

..\conda\envs\py37\lib\zipfile.py:1207: PermissionError
_________________________ Tester.test_video_clips __________________________

self =

    def test_video_clips(self):
        with get_list_of_videos(num_videos=3) as video_list:
>           video_clips = VideoClips(video_list, 5, 5)

test\test_datasets_video_utils.py:62:

..\conda\envs\py37\lib\site-packages\torchvision\datasets\video_utils.py:55: in __init__
    self._compute_frame_pts()
..\conda\envs\py37\lib\site-packages\torchvision\datasets\video_utils.py:84: in _compute_frame_pts
    for batch in dl:
..\conda\envs\py37\lib\site-packages\torch\utils\data\dataloader.py:278: in __iter__
    return _MultiProcessingDataLoaderIter(self)
..\conda\envs\py37\lib\site-packages\torch\utils\data\dataloader.py:682: in __init__
    w.start()
..\conda\envs\py37\lib\multiprocessing\process.py:112: in start
    self._popen = self._Popen(self)
..\conda\envs\py37\lib\multiprocessing\context.py:223: in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
..\conda\envs\py37\lib\multiprocessing\context.py:322: in _Popen
    return Popen(process_obj)
..\conda\envs\py37\lib\multiprocessing\popen_spawn_win32.py:89: in __init__
    reduction.dump(process_obj, to_child)

obj =
protocol = None

    def dump(obj, file, protocol=None):
        '''Replacement for pickle.dump() using ForkingPickler.'''
>       ForkingPickler(file, protocol).dump(obj)

E       AttributeError: Can't pickle local object 'VideoClips._compute_frame_pts.<locals>.DS'

..\conda\envs\py37\lib\multiprocessing\reduction.py:60: AttributeError
---------------------------- Captured stderr call -----------------------------

____________________ Tester.test_video_clips_custom_fps _____________________

self =

    def test_video_clips_custom_fps(self):
        with get_list_of_videos(num_videos=3, sizes=[12, 12, 12], fps=[3, 4, 6]) as video_list:
            num_frames = 4
            for fps in [1, 3, 4, 10]:
>               video_clips = VideoClips(video_list, num_frames, num_frames, fps)

test\test_datasets_video_utils.py:117:

..\conda\envs\py37\lib\site-packages\torchvision\datasets\video_utils.py:55: in __init__
    self._compute_frame_pts()
..\conda\envs\py37\lib\site-packages\torchvision\datasets\video_utils.py:84: in _compute_frame_pts
    for batch in dl:
..\conda\envs\py37\lib\site-packages\torch\utils\data\dataloader.py:278: in __iter__
    return _MultiProcessingDataLoaderIter(self)
..\conda\envs\py37\lib\site-packages\torch\utils\data\dataloader.py:682: in __init__
    w.start()
..\conda\envs\py37\lib\multiprocessing\process.py:112: in start
    self._popen = self._Popen(self)
..\conda\envs\py37\lib\multiprocessing\context.py:223: in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
..\conda\envs\py37\lib\multiprocessing\context.py:322: in _Popen
    return Popen(process_obj)
..\conda\envs\py37\lib\multiprocessing\popen_spawn_win32.py:89: in __init__
    reduction.dump(process_obj, to_child)

obj =
protocol = None

    def dump(obj, file, protocol=None):
        '''Replacement for pickle.dump() using ForkingPickler.'''
>       ForkingPickler(file, protocol).dump(obj)

E       AttributeError: Can't pickle local object 'VideoClips._compute_frame_pts.<locals>.DS'

..\conda\envs\py37\lib\multiprocessing\reduction.py:60: AttributeError
---------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):

  File "<string>", line 1, in <module>
  File "c:\w\2\s\packaging\windows\conda\envs\py37\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "c:\w\2\s\packaging\windows\conda\envs\py37\lib\multiprocessing\spawn.py", line 115, in _main
    self = reduction.pickle.load(from_parent)
EOFError: Ran out of input

-------------------------- Captured stderr teardown ---------------------------
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "c:\w\2\s\packaging\windows\conda\envs\py37\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "c:\w\2\s\packaging\windows\conda\envs\py37\lib\multiprocessing\spawn.py", line 115, in _main
    self = reduction.pickle.load(from_parent)
EOFError: Ran out of input

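The two `Can't pickle local object 'VideoClips._compute_frame_pts.<locals>.DS'` failures above come from Windows' `spawn` start method: worker processes receive their work by pickling, and a class defined inside a function body cannot be pickled by reference. A minimal sketch of the failure mode and the usual fix of hoisting the class to module level — names here are illustrative, not torchvision's actual code:

```python
import pickle

def make_local_dataset(video_paths):
    # A class defined inside a function is a "local object": pickle
    # cannot look it up by qualified name, which is exactly what
    # multiprocessing's spawn start method (the only one on Windows) needs.
    class DS:
        def __init__(self, x):
            self.x = x
    return DS(video_paths)

# The same class hoisted to module level is picklable by reference.
class VideoPathDataset:
    def __init__(self, x):
        self.x = x

try:
    pickle.dumps(make_local_dataset(["a.mp4"]))
except (AttributeError, pickle.PicklingError) as e:
    print("local class fails to pickle:", e)

roundtripped = pickle.loads(pickle.dumps(VideoPathDataset(["a.mp4"])))
print("module-level class round-trips:", roundtripped.x)
```

The follow-up `EOFError: Ran out of input` in the captured stderr is secondary: the child process died before the parent finished handing it the pickled state.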

______________________ Tester.test_read_partial_video _______________________

self =

    def test_read_partial_video(self):
>       with temp_video(10, 300, 300, 5, lossless=True) as (f_name, data):

test\test_io.py:84:

..\conda\envs\py37\lib\contextlib.py:112: in __enter__
    return next(self.gen)
test\test_io.py:51: in temp_video
    io.write_video(f.name, data, fps=fps, video_codec=video_codec, options=options)
..\conda\envs\py37\lib\site-packages\torchvision\io\video.py:55: in write_video
    container.mux(packet)
av/container/output.pyx:198: in av.container.output.OutputContainer.mux
    ???
av/container/output.pyx:204: in av.container.output.OutputContainer.mux_one
    ???
av/container/output.pyx:166: in av.container.output.OutputContainer.start_encoding
    ???

>   ???
E   av.AVError: [Errno 13] Permission denied

av/utils.pyx:109: AVError
__________________ Tester.test_read_partial_video_bframes ___________________

self =

    def test_read_partial_video_bframes(self):
        # do not use lossless encoding, to test the presence of B-frames
        options = {'bframes': '16', 'keyint': '10', 'min-keyint': '4'}
>       with temp_video(100, 300, 300, 5, options=options) as (f_name, data):

test\test_io.py:100:

..\conda\envs\py37\lib\contextlib.py:112: in __enter__
    return next(self.gen)
test\test_io.py:51: in temp_video
    io.write_video(f.name, data, fps=fps, video_codec=video_codec, options=options)
..\conda\envs\py37\lib\site-packages\torchvision\io\video.py:55: in write_video
    container.mux(packet)
av/container/output.pyx:198: in av.container.output.OutputContainer.mux
    ???
av/container/output.pyx:204: in av.container.output.OutputContainer.mux_one
    ???
av/container/output.pyx:166: in av.container.output.OutputContainer.start_encoding
    ???

>   ???
E   av.AVError: [Errno 13] Permission denied

av/utils.pyx:109: AVError
________________________ Tester.test_read_timestamps ________________________

self =

    def test_read_timestamps(self):
>       with temp_video(10, 300, 300, 5) as (f_name, data):

test\test_io.py:69:

..\conda\envs\py37\lib\contextlib.py:112: in __enter__
    return next(self.gen)
test\test_io.py:51: in temp_video
    io.write_video(f.name, data, fps=fps, video_codec=video_codec, options=options)
..\conda\envs\py37\lib\site-packages\torchvision\io\video.py:59: in write_video
    container.mux(packet)
av/container/output.pyx:198: in av.container.output.OutputContainer.mux
    ???
av/container/output.pyx:204: in av.container.output.OutputContainer.mux_one
    ???
av/container/output.pyx:166: in av.container.output.OutputContainer.start_encoding
    ???

>   ???
E   av.AVError: [Errno 13] Permission denied

av/utils.pyx:109: AVError
___________________ Tester.test_read_timestamps_from_packet _________________

self =

    def test_read_timestamps_from_packet(self):
>       with temp_video(10, 300, 300, 5, video_codec='mpeg4') as (f_name, data):

test\test_io.py:129:

..\conda\envs\py37\lib\contextlib.py:112: in __enter__
    return next(self.gen)
test\test_io.py:51: in temp_video
    io.write_video(f.name, data, fps=fps, video_codec=video_codec, options=options)
..\conda\envs\py37\lib\site-packages\torchvision\io\video.py:55: in write_video
    container.mux(packet)
av/container/output.pyx:198: in av.container.output.OutputContainer.mux
    ???
av/container/output.pyx:204: in av.container.output.OutputContainer.mux_one
    ???
av/container/output.pyx:166: in av.container.output.OutputContainer.start_encoding
    ???

>   ???
E   av.AVError: [Errno 13] Permission denied

av/utils.pyx:109: AVError
________________________ Tester.test_write_read_video _______________________

self =

    def test_write_read_video(self):
>       with temp_video(10, 300, 300, 5, lossless=True) as (f_name, data):

test\test_io.py:62:

..\conda\envs\py37\lib\contextlib.py:112: in __enter__
    return next(self.gen)
test\test_io.py:51: in temp_video
    io.write_video(f.name, data, fps=fps, video_codec=video_codec, options=options)
..\conda\envs\py37\lib\site-packages\torchvision\io\video.py:55: in write_video
    container.mux(packet)
av/container/output.pyx:198: in av.container.output.OutputContainer.mux
    ???
av/container/output.pyx:204: in av.container.output.OutputContainer.mux_one
    ???
av/container/output.pyx:166: in av.container.output.OutputContainer.start_encoding
    ???

>   ???
E   av.AVError: [Errno 13] Permission denied

av/utils.pyx:109: AVError
__________________________ Tester.test_save_image ___________________________

self =

    def test_save_image(self):
        with tempfile.NamedTemporaryFile(suffix='.png') as f:
            t = torch.rand(2, 3, 64, 64)
>           utils.save_image(t, f.name)

test\test_utils.py:43:

..\conda\envs\py37\lib\site-packages\torchvision\utils.py:105: in save_image
    im.save(filename)

self =
fp = 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmpm0s9rq8o.png'
format = 'PNG', params = {}
filename = 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmpm0s9rq8o.png'
open_fp = True, save_all = False, ext = '.png'
save_handler =

    def save(self, fp, format=None, **params):
        """
        Saves this image under the given filename.  If no format is
        specified, the format to use is determined from the filename
        extension, if possible.

        Keyword options can be used to provide additional instructions
        to the writer. If a writer doesn't recognise an option, it is
        silently ignored. The available options are described in the
        :doc:`image format documentation
        <../handbook/image-file-formats>` for each writer.

        You can use a file object instead of a filename. In this case,
        you must always specify the format. The file object must
        implement the ``seek``, ``tell``, and ``write``
        methods, and be opened in binary mode.

        :param fp: A filename (string), pathlib.Path object or file object.
        :param format: Optional format override.  If omitted, the
           format to use is determined from the filename extension.
           If a file object was used instead of a filename, this
           parameter should always be used.
        :param params: Extra parameters to the image writer.
        :returns: None
        :exception ValueError: If the output format could not be determined
           from the file name.  Use the format option to solve this.
        :exception IOError: If the file could not be written.  The file
           may have been created, and may contain partial data.
        """

        filename = ""
        open_fp = False
        if isPath(fp):
            filename = fp
            open_fp = True
        elif HAS_PATHLIB and isinstance(fp, Path):
            filename = str(fp)
            open_fp = True
        if not filename and hasattr(fp, "name") and isPath(fp.name):
            # only set the name for metadata purposes
            filename = fp.name

        # may mutate self!
        self._ensure_mutable()

        save_all = params.pop("save_all", False)
        self.encoderinfo = params
        self.encoderconfig = ()

        preinit()

        ext = os.path.splitext(filename)[1].lower()

        if not format:
            if ext not in EXTENSION:
                init()
            try:
                format = EXTENSION[ext]
            except KeyError:
                raise ValueError("unknown file extension: {}".format(ext))

        if format.upper() not in SAVE:
            init()
        if save_all:
            save_handler = SAVE_ALL[format.upper()]
        else:
            save_handler = SAVE[format.upper()]

        if open_fp:
            if params.get("append", False):
                fp = builtins.open(filename, "r+b")
            else:
                # Open also for reading ("+"), because TIFF save_all
                # writer needs to go back and edit the written data.
>               fp = builtins.open(filename, "w+b")

E               PermissionError: [Errno 13] Permission denied: 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmpm0s9rq8o.png'

..\conda\envs\py37\lib\site-packages\PIL\Image.py:2085: PermissionError
____________________ Tester.test_save_image_single_pixel ____________________

self =

    def test_save_image_single_pixel(self):
        with tempfile.NamedTemporaryFile(suffix='.png') as f:
            t = torch.rand(1, 3, 1, 1)
>           utils.save_image(t, f.name)

test\test_utils.py:49:

..\conda\envs\py37\lib\site-packages\torchvision\utils.py:105: in save_image
    im.save(filename)

self =
fp = 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmpn5hd8b_0.png'
format = 'PNG', params = {}
filename = 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmpn5hd8b_0.png'
open_fp = True, save_all = False, ext = '.png'
save_handler =

    def save(self, fp, format=None, **params):
        """
        Saves this image under the given filename.  If no format is
        specified, the format to use is determined from the filename
        extension, if possible.

        Keyword options can be used to provide additional instructions
        to the writer. If a writer doesn't recognise an option, it is
        silently ignored. The available options are described in the
        :doc:`image format documentation
        <../handbook/image-file-formats>` for each writer.

        You can use a file object instead of a filename. In this case,
        you must always specify the format. The file object must
        implement the ``seek``, ``tell``, and ``write``
        methods, and be opened in binary mode.

        :param fp: A filename (string), pathlib.Path object or file object.
        :param format: Optional format override.  If omitted, the
           format to use is determined from the filename extension.
           If a file object was used instead of a filename, this
           parameter should always be used.
        :param params: Extra parameters to the image writer.
        :returns: None
        :exception ValueError: If the output format could not be determined
           from the file name.  Use the format option to solve this.
        :exception IOError: If the file could not be written.  The file
           may have been created, and may contain partial data.
        """

        filename = ""
        open_fp = False
        if isPath(fp):
            filename = fp
            open_fp = True
        elif HAS_PATHLIB and isinstance(fp, Path):
            filename = str(fp)
            open_fp = True
        if not filename and hasattr(fp, "name") and isPath(fp.name):
            # only set the name for metadata purposes
            filename = fp.name

        # may mutate self!
        self._ensure_mutable()

        save_all = params.pop("save_all", False)
        self.encoderinfo = params
        self.encoderconfig = ()

        preinit()

        ext = os.path.splitext(filename)[1].lower()

        if not format:
            if ext not in EXTENSION:
                init()
            try:
                format = EXTENSION[ext]
            except KeyError:
                raise ValueError("unknown file extension: {}".format(ext))

        if format.upper() not in SAVE:
            init()
        if save_all:
            save_handler = SAVE_ALL[format.upper()]
        else:
            save_handler = SAVE[format.upper()]

        if open_fp:
            if params.get("append", False):
                fp = builtins.open(filename, "r+b")
            else:
                # Open also for reading ("+"), because TIFF save_all
                # writer needs to go back and edit the written data.
>               fp = builtins.open(filename, "w+b")

E               PermissionError: [Errno 13] Permission denied: 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\tmpn5hd8b_0.png'

..\conda\envs\py37\lib\site-packages\PIL\Image.py:2085: PermissionError
============================== warnings summary ===============================
c:\w\2\s\packaging\windows\conda\envs\py37\lib\site-packages\torchvision\datasets\lsun.py:8
  c:\w\2\s\packaging\windows\conda\envs\py37\lib\site-packages\torchvision\datasets\lsun.py:8: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working
    from collections import Iterable

c:\w\2\s\packaging\windows\conda\envs\py37\lib\site-packages\av\container\__init__.py:1
  c:\w\2\s\packaging\windows\conda\envs\py37\lib\site-packages\av\container\__init__.py:1: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working
    from .core import Container, open

test/test_datasets.py::Tester::test_imagenet
  c:\w\2\s\packaging\windows\conda\envs\py37\lib\importlib\_bootstrap.py:219: RuntimeWarning: numpy.ufunc size changed, may indicate binary incompatibility. Expected 192 from C header, got 216 from PyObject
    return f(*args, **kwds)

test/test_transforms.py::Tester::test_random_perspective
  c:\w\2\s\packaging\windows\conda\envs\py37\lib\site-packages\torchvision\transforms\functional.py:440: UserWarning: torch.gels is deprecated in favour of torch.lstsq and will be removed in the next release. Please use torch.lstsq instead.
    res = torch.gels(B, A)[0]

-- Docs: https://docs.pytest.org/en/latest/warnings.html
======= 32 failed, 141 passed, 14 skipped, 4 warnings in 407.68 seconds =======
```
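Most of the `[Errno 13] Permission denied` failures in this log share one root cause: on Windows, a `tempfile.NamedTemporaryFile` keeps an open handle that blocks a second open of the same path by name, so patterns like `gzip.GzipFile(f.name, 'wb')` or `im.save(f.name)` inside the `with` block fail. A sketch of the standard portable workaround, `delete=False` plus manual cleanup (this is just the usual pattern, not necessarily the fix the project adopted):

```python
import os
import tempfile

# Create the temp file with delete=False, close it to release the Windows
# handle, reopen it by name, and delete it manually at the end. With the
# default delete=True, reopening f.name while f is still open fails on
# Windows with PermissionError (Errno 13).
f = tempfile.NamedTemporaryFile(suffix='.gz', delete=False)
try:
    f.close()                         # release the handle before reopening by name
    with open(f.name, 'wb') as out:   # stand-in for gzip.GzipFile(f.name, 'wb')
        out.write(b'payload')
    with open(f.name, 'rb') as inp:
        assert inp.read() == b'payload'
finally:
    os.unlink(f.name)                 # manual cleanup replaces delete=True
```

The same pattern covers the `av.AVError: [Errno 13]` cases, where PyAV reopens the temp file's path for writing.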

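The `[WinError 32]` failure inside `shutil.rmtree` in the log is the same sharing problem from the other side: a file that another handle still holds open cannot be unlinked on Windows. Test suites commonly wrap `rmtree` with an `onerror` handler that clears the read-only bit and retries; a small illustrative sketch (hypothetical helper names, not the project's actual code — and note it cannot help when another process genuinely still holds the file open):

```python
import os
import shutil
import stat
import tempfile

def _force_remove(func, path, exc_info):
    # onerror handler: clear the read-only bit and retry the failed call once.
    # Handles the common case of read-only leftovers from archive extraction.
    os.chmod(path, stat.S_IWRITE)
    func(path)

def robust_rmtree(path):
    shutil.rmtree(path, onerror=_force_remove)

# usage: build a tree containing a read-only file, then remove it
d = tempfile.mkdtemp()
p = os.path.join(d, 'data.bin')
with open(p, 'wb') as fh:
    fh.write(b'x')
os.chmod(p, stat.S_IREAD)
robust_rmtree(d)
```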
\n","meta":{"source":"GitHub","url":"https://github.com/pytorch/vision/issues/1229"},"_input_hash":1014989260,"_task_hash":1504580154,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"User Info Endpoint Handling Expects Key \"user\" in Response","meta":{"source":"GitHub","url":"https://github.com/Schine/MW-OAuth2Client/issues/3"},"label":"DOCUMENTATION","_input_hash":-1060927024,"_task_hash":808110389,"answer":"reject"} {"text":"Document how to actually run the example in the `usage` section","meta":{"source":"GitHub","url":"https://github.com/graphcool/chromeless/issues/52"},"label":"DOCUMENTATION","_input_hash":380838276,"_task_hash":-331072797,"answer":"accept"} {"text":"# Color Theme Bug\n\n* I am reporting a bug. I used the bug reporting system by using `SPC h I` in spacemacs. However, at the end after using `C-c C-c` to submit, it pops out a webpage with error. I am not sure if it has been successfully submitted so decide to paste it here anyway.\r\n\r\n---\r\n\r\n#### Description :octocat:\r\nI love the color theme \"gruvbox\", and have tried very hard to implement that.\r\nI followed the documentation and found a bug.\r\n\r\n#### Reproduction guide :beetle:\r\n- Edit ~/.spacemacs by adding 'gruvbox' in to the list of themes:\r\n ```\r\n ...\r\n dotspacemacs-themes '(gruvbox spacemacs spacemacs-light)'\r\n ...\r\n ```\r\n- Open emacs\r\n- Emacs seems to be downloading the theme\r\n- It works! (for only this time)\r\n- Close emacs\r\n- Open emacs again\r\n- An error pop out! (more in the observed behaviour section)\r\n- Close emacs, sadly\r\n- Remove \"gruvbox\" from the config file\r\n- Open emacs again. Emacs will remove the theme package it just installed.\r\n- Close emacs\r\n- Add \"gruvbox\" again in the config, the same way.\r\n- Open emacs, and it will download and load the theme correctly.\r\n- Close emacs, and openning it again gives the same error.\r\n- Repeat the above. 
The error came out 5 times, and I am convinced that this is a bug.\r\n\r\n*Observed behaviour:* :eyes: :broken_heart:\r\n- After opening emacs the second time, the theme \"gruvbox\" fails to be loaded.\r\n- The system falls back to the default theme, and outputs an error/warning message on the welcoming page, as follows:\r\n Warnings:\r\n - An error occurred while applying the theme \"gruvbox\", fallback on theme\r\n \"spacemacs-dark\". Error was: (file-missing Cannot open load file No such file or\r\n directory autothemer) \r\n - Please check the value of \"dotspacemacs-themes\" in your dotfile or open an issue\r\n so we can add support for the theme \"gruvbox\". \r\n\r\n\r\n*Expected behaviour:* :heart: :smile:\r\n- The system should (but not) work as the theme was freshly loaded.\r\n\r\n#### System Info :computer:\r\n- OS: gnu/linux\r\n- Emacs: 26.2\r\n- Spacemacs: 0.200.13\r\n- Spacemacs branch: master (rev. 8c0b8c344)\r\n- Graphic display: t\r\n- Distribution: spacemacs\r\n- Editing style: vim\r\n- Completion: helm\r\n- Layers:\r\n```elisp\r\n(helm emacs-lisp git markdown org)\r\n```\r\n- System configuration features: XPM JPEG TIFF GIF PNG RSVG IMAGEMAGICK SOUND GPM DBUS GSETTINGS GLIB NOTIFY ACL GNUTLS LIBXML2 FREETYPE M17N_FLT LIBOTF XFT ZLIB TOOLKIT_SCROLL_BARS GTK3 X11 XDBE XIM MODULES THREADS LIBSYSTEMD LCMS2\r\n\r\n\r\n#### Backtrace :paw_prints:\r\n```\r\n<>\r\n```\r\n","title":"Color Theme Bug","body":"* I am reporting a bug. I used the bug reporting system by using `SPC h I` in spacemacs. However, at the end after using `C-c C-c` to submit, it pops out a webpage with error. 
I am not sure if it has been successfully submitted so decide to paste it here anyway.\r\n\r\n---\r\n\r\n#### Description :octocat:\r\nI love the color theme \"gruvbox\", and have tried very hard to implement that.\r\nI followed the documentation and found a bug.\r\n\r\n#### Reproduction guide :beetle:\r\n- Edit ~/.spacemacs by adding 'gruvbox' in to the list of themes:\r\n ```\r\n ...\r\n dotspacemacs-themes '(gruvbox spacemacs spacemacs-light)'\r\n ...\r\n ```\r\n- Open emacs\r\n- Emacs seems to be downloading the theme\r\n- It works! (for only this time)\r\n- Close emacs\r\n- Open emacs again\r\n- An error pop out! (more in the observed behaviour section)\r\n- Close emacs, sadly\r\n- Remove \"gruvbox\" from the config file\r\n- Open emacs again. Emacs will remove the theme package it just installed.\r\n- Close emacs\r\n- Add \"gruvbox\" again in the config, the same way.\r\n- Open emacs, and it will download and load the theme correctly.\r\n- Close emacs, and openning it again gives the same error.\r\n- Repeat the above. The error came out 5 times, and I am convinced that this is a bug.\r\n\r\n*Observed behaviour:* :eyes: :broken_heart:\r\n- After opening emacs the second time, the theme \"gruvbox\" fails to be loaded.\r\n- The system falls back to the default theme, and outputs an error/warning message on the welcoming page, as follows:\r\n Warnings:\r\n - An error occurred while applying the theme \"gruvbox\", fallback on theme\r\n \"spacemacs-dark\". Error was: (file-missing Cannot open load file No such file or\r\n directory autothemer) \r\n - Please check the value of \"dotspacemacs-themes\" in your dotfile or open an issue\r\n so we can add support for the theme \"gruvbox\". \r\n\r\n\r\n*Expected behaviour:* :heart: :smile:\r\n- The system should (but not) work as the theme was freshly loaded.\r\n\r\n#### System Info :computer:\r\n- OS: gnu/linux\r\n- Emacs: 26.2\r\n- Spacemacs: 0.200.13\r\n- Spacemacs branch: master (rev. 
8c0b8c344)\r\n- Graphic display: t\r\n- Distribution: spacemacs\r\n- Editing style: vim\r\n- Completion: helm\r\n- Layers:\r\n```elisp\r\n(helm emacs-lisp git markdown org)\r\n```\r\n- System configuration features: XPM JPEG TIFF GIF PNG RSVG IMAGEMAGICK SOUND GPM DBUS GSETTINGS GLIB NOTIFY ACL GNUTLS LIBXML2 FREETYPE M17N_FLT LIBOTF XFT ZLIB TOOLKIT_SCROLL_BARS GTK3 X11 XDBE XIM MODULES THREADS LIBSYSTEMD LCMS2\r\n\r\n\r\n#### Backtrace :paw_prints:\r\n```\r\n<>\r\n```\r\n","html":"

Color Theme Bug

\n\n
    \n
  • I am reporting a bug. I used the bug reporting system by using SPC h I in spacemacs. However, at the end after using C-c C-c to submit, it pops out a webpage with error. I am not sure if it has been successfully submitted so decide to paste it here anyway.
  • \n
\n\n
\n\n

Description :octocat:

\n\n

I love the color theme \"gruvbox\", and have tried very hard to implement that.\nI followed the documentation and found a bug.

\n\n

Reproduction guide :beetle:

\n\n
    \n
  • Edit ~/.spacemacs by adding 'gruvbox' in to the list of themes:\n\n...\ndotspacemacs-themes '(gruvbox spacemacs spacemacs-light)'\n...\n
  • \n
  • Open emacs
  • \n
  • Emacs seems to be downloading the theme
  • \n
  • It works! (for only this time)
  • \n
  • Close emacs
  • \n
  • Open emacs again
  • \n
  • An error pops out! (more in the observed behaviour section)
  • \n
  • Close emacs, sadly
  • \n
  • Remove \"gruvbox\" from the config file
  • \n
  • Open emacs again. Emacs will remove the theme package it just installed.
  • \n
  • Close emacs
  • \n
  • Add \"gruvbox\" again in the config, the same way.
  • \n
  • Open emacs, and it will download and load the theme correctly.
  • \n
  • Close emacs, and opening it again gives the same error.
  • \n
  • Repeat the above. The error came out 5 times, and I am convinced that this is a bug.
  • \n
\n\n

Observed behaviour: :eyes: :broken_heart:\n- After opening emacs the second time, the theme \"gruvbox\" fails to be loaded.\n- The system falls back to the default theme, and outputs an error/warning message on the welcoming page, as follows:\n Warnings:\n - An error occurred while applying the theme \"gruvbox\", fallback on theme\n \"spacemacs-dark\". Error was: (file-missing Cannot open load file No such file or\n directory autothemer)
\n - Please check the value of \"dotspacemacs-themes\" in your dotfile or open an issue\n so we can add support for the theme \"gruvbox\".

\n\n

Expected behaviour: :heart: :smile:\n- The system should (but does not) work as if the theme was freshly loaded.

\n\n

System Info :computer:

\n\n
    \n
  • OS: gnu/linux
  • \n
  • Emacs: 26.2
  • \n
  • Spacemacs: 0.200.13
  • \n
  • Spacemacs branch: master (rev. 8c0b8c344)
  • \n
  • Graphic display: t
  • \n
  • Distribution: spacemacs
  • \n
  • Editing style: vim
  • \n
  • Completion: helm
  • \n
  • Layers:\n(helm emacs-lisp git markdown org)\n
  • \n
  • System configuration features: XPM JPEG TIFF GIF PNG RSVG IMAGEMAGICK SOUND GPM DBUS GSETTINGS GLIB NOTIFY ACL GNUTLS LIBXML2 FREETYPE M17N_FLT LIBOTF XFT ZLIB TOOLKIT_SCROLL_BARS GTK3 X11 XDBE XIM MODULES THREADS LIBSYSTEMD LCMS2
  • \n
\n\n

Backtrace :paw_prints:

\n\n

\n<<BACKTRACE IF RELEVANT>>\n

\n","meta":{"source":"GitHub","url":"https://github.com/syl20bnr/spacemacs/issues/12613"},"_input_hash":-301537452,"_task_hash":587781196,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"`npm run pack` to build binaries is not working ","meta":{"source":"GitHub","url":"https://github.com/datahq/datahub-cli/issues/130"},"label":"DOCUMENTATION","_input_hash":1961344193,"_task_hash":1397168264,"answer":"reject"} {"text":"# Must have proxy servers as neutral for WDAG\n\nHostnames and aliases for proxy servers must be included in the \"Domains categorized as both work and personal\" policy (aka \"neutral\"). That's an important note left out of the documentation for that policy.\r\n\r\n---\r\n#### Document Details\r\n\r\n\u26a0 *Do not edit this section. It is required for docs.microsoft.com \u279f GitHub issue linking.*\r\n\r\n* ID: 41a8c802-79ca-1593-c045-2ed150e9ce40\r\n* Version Independent ID: 3dabd613-2eee-7c65-6f30-6746c8fb5bdf\r\n* Content: [Configure the Group Policy settings for Windows Defender Application Guard (Windows 10)](https://docs.microsoft.com/en-us/windows/security/threat-protection/windows-defender-application-guard/configure-wd-app-guard)\r\n* Content Source: [windows/security/threat-protection/windows-defender-application-guard/configure-wd-app-guard.md](https://github.com/MicrosoftDocs/windows-itpro-docs/blob/master/windows/security/threat-protection/windows-defender-application-guard/configure-wd-app-guard.md)\r\n* Product: **w10**\r\n* Technology: **windows**\r\n* GitHub Login: @Dansimp\r\n* Microsoft Alias: **dansimp**","title":"Must have proxy servers as neutral for WDAG","body":"Hostnames and aliases for proxy servers must be included in the \"Domains categorized as both work and personal\" policy (aka \"neutral\"). That's an important note left out of the documentation for that policy.\r\n\r\n---\r\n#### Document Details\r\n\r\n\u26a0 *Do not edit this section. 
It is required for docs.microsoft.com \u279f GitHub issue linking.*\r\n\r\n* ID: 41a8c802-79ca-1593-c045-2ed150e9ce40\r\n* Version Independent ID: 3dabd613-2eee-7c65-6f30-6746c8fb5bdf\r\n* Content: [Configure the Group Policy settings for Windows Defender Application Guard (Windows 10)](https://docs.microsoft.com/en-us/windows/security/threat-protection/windows-defender-application-guard/configure-wd-app-guard)\r\n* Content Source: [windows/security/threat-protection/windows-defender-application-guard/configure-wd-app-guard.md](https://github.com/MicrosoftDocs/windows-itpro-docs/blob/master/windows/security/threat-protection/windows-defender-application-guard/configure-wd-app-guard.md)\r\n* Product: **w10**\r\n* Technology: **windows**\r\n* GitHub Login: @Dansimp\r\n* Microsoft Alias: **dansimp**","html":"

Must have proxy servers as neutral for WDAG

\n\n

Hostnames and aliases for proxy servers must be included in the \"Domains categorized as both work and personal\" policy (aka \"neutral\"). That's an important note left out of the documentation for that policy.

\n\n
\n\n

Document Details

\n\n

\u26a0 Do not edit this section. It is required for docs.microsoft.com \u279f GitHub issue linking.

\n\n\n","meta":{"source":"GitHub","url":"https://github.com/MicrosoftDocs/windows-itpro-docs/issues/4668"},"_input_hash":1243948170,"_task_hash":2019101841,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Create examples and instructions on how to use this module","meta":{"source":"GitHub","url":"https://github.com/aviggiano/redis-roaring/issues/41"},"label":"DOCUMENTATION","_input_hash":-1579315758,"_task_hash":967662467,"answer":"accept"} {"text":"Do you have room for one more?","meta":{"source":"GitHub","url":"https://github.com/githubschool/open-enrollment-classes-introduction-to-github/issues/9007"},"label":"DOCUMENTATION","_input_hash":-1804470988,"_task_hash":1357481608,"answer":"reject"} {"text":"Use \"universal\" categories for part-of-speech tags (1.10.0)","meta":{"source":"GitHub","url":"https://github.com/dkpro/dkpro-core/issues/1088"},"label":"DOCUMENTATION","_input_hash":-2143876469,"_task_hash":-1672802163,"answer":"reject"} {"text":"# Mouse locking does not work\n\nHello,\r\n\r\nFollowing up on my \"Civilization\" project here: https://zmozgu.net/civ.html\r\n\r\nI wanted to enable mouse locking feature. I have followed the documentation and modified my static HTML file to create the \"dosbox.conf\" and pass the \"autolock=true\" into it. \r\n\r\nWeird thing happens - the OS cursor does not disappear after clicking and there is a position difference between the OS cursor and DOS cursor, that is visible in the game area - the further you move the OS cursor to the sides, the bigger the distance between it and the DOS cursor becomes. \r\n\r\nYou can check it by yourself - you have to wait until the game intro ends and the DOS cursor can be visible.\r\n\r\nWhat am I doing wrong?\r\n\r\nMy code is below:\r\n\r\n```\r\n\r\n \r\n \r\n\r\n\r\n\r\n```","title":"Mouse locking does not work","body":"Hello,\r\n\r\nFollowing up on my \"Civilization\" project here: https://zmozgu.net/civ.html\r\n\r\nI wanted to enable mouse locking feature. 
I have followed the documentation and modified my static HTML file to create the \"dosbox.conf\" and pass the \"autolock=true\" into it. \r\n\r\nWeird thing happens - the OS cursor does not disappear after clicking and there is a position difference between the OS cursor and DOS cursor, that is visible in the game area - the further you move the OS cursor to the sides, the bigger the distance between it and the DOS cursor becomes. \r\n\r\nYou can check it by yourself - you have to wait until the game intro ends and the DOS cursor can be visible.\r\n\r\nWhat am I doing wrong?\r\n\r\nMy code is below:\r\n\r\n```\r\n\r\n \r\n \r\n\r\n\r\n\r\n```","html":"

Mouse locking does not work

\n\n

Hello,

\n\n

Following up on my \"Civilization\" project here: https://zmozgu.net/civ.html

\n\n

I wanted to enable mouse locking feature. I have followed the documentation and modified my static HTML file to create the \"dosbox.conf\" and pass the \"autolock=true\" into it.

\n\n

Weird thing happens - the OS cursor does not disappear after clicking and there is a position difference between the OS cursor and DOS cursor, that is visible in the game area - the further you move the OS cursor to the sides, the bigger the distance between it and the DOS cursor becomes.

\n\n

You can check it by yourself - you have to wait until the game intro ends and the DOS cursor can be visible.

\n\n

What am I doing wrong?

\n\n

My code is below:

\n\n

```\n\n \n \n

\n\n

\n```

\n","meta":{"source":"GitHub","url":"https://github.com/caiiiycuk/js-dos/issues/63"},"_input_hash":-933002576,"_task_hash":-1792782116,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Issue when installing: ","meta":{"source":"GitHub","url":"https://github.com/strace/strace/issues/11"},"label":"DOCUMENTATION","_input_hash":380976947,"_task_hash":-120282505,"answer":"reject"} {"text":"Add a TMX parser","meta":{"source":"GitHub","url":"https://github.com/strinking/strikemon/issues/1"},"label":"DOCUMENTATION","_input_hash":-981387814,"_task_hash":1604708204,"answer":"reject"} {"text":"README","meta":{"source":"GitHub","url":"https://github.com/rpavlik/PS2-Breakout-Board/issues/3"},"label":"DOCUMENTATION","_input_hash":18546734,"_task_hash":127174695,"answer":"accept"} {"text":"QFlightInstruments README misleading / incomplete","meta":{"source":"GitHub","url":"https://github.com/JdeRobot/ThirdParty/issues/12"},"label":"DOCUMENTATION","_input_hash":-1992554639,"_task_hash":-681958394,"answer":"accept"} {"text":"Invalid reference links provided in documentation","meta":{"source":"GitHub","url":"https://github.com/okta/okta-sdk-java/issues/113"},"label":"DOCUMENTATION","_input_hash":-463839177,"_task_hash":770615733,"answer":"accept"} {"text":"# Create Topics index.\n\nNeed to add `// Topics:` tags to docs and generate topics index","title":"Create Topics index.","body":"Need to add `// Topics:` tags to docs and generate topics index","html":"

Create Topics index.

\n\n

Need to add // Topics: tags to docs and generate topics index

\n","meta":{"source":"GitHub","url":"https://github.com/revarbat/BOSL2/issues/88"},"_input_hash":-820054882,"_task_hash":609380141,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"performance of 2013-Haswell and 2017-SkylakePurley on Skylake-SP","meta":{"source":"GitHub","url":"https://github.com/Mysticial/Flops/issues/18"},"label":"DOCUMENTATION","_input_hash":-1767498199,"_task_hash":-475786604,"answer":"reject"} {"text":"# README update \n\n","title":"README update ","body":"","html":"

README update

\n","meta":{"source":"GitHub","url":"https://github.com/JiHaeK/videoTotext/issues/4"},"_input_hash":747804941,"_task_hash":1123014599,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"# getDay doesn't let you set weekStartsOn\n\nHi there, thanks for maintaining this super useful library,\r\nI have a remark / request / issue regarding `getDay`:\r\n\r\n`getWeekOfMonth` accept options such as `locale, weekStartsOn`.\r\nhttps://date-fns.org/v2.0.0-beta.4/docs/getWeekOfMonth\r\n\r\nThis means that:\r\n```ts\r\nconst aug11_2019 = new Date(2019, 7, 11)\r\ngetWeekOfMonth(aug11_2019) === 3 // Sunday, third week of august\r\ngetWeekOfMonth(aug11_2019, {weeksStartsOn: 1) === 2 // Week starts on Monday, we're still on week 2\r\n```\r\n\r\n(supposedly, there's a known bug with Sundays: https://github.com/date-fns/date-fns/issues/1040)\r\n\r\nI expect the `getDay` function to let me set `locale, weeksStartsOn` too, so that:\r\n\r\n```ts\r\ngetDay(aug11_2019) === 0 // Sunday, first day of the week\r\ngetDay(aug11_2019, {weeksStartsOn: 1) === 6 // last day of the previous week\r\n```","title":"getDay doesn't let you set weekStartsOn","body":"Hi there, thanks for maintaining this super useful library,\r\nI have a remark / request / issue regarding `getDay`:\r\n\r\n`getWeekOfMonth` accept options such as `locale, weekStartsOn`.\r\nhttps://date-fns.org/v2.0.0-beta.4/docs/getWeekOfMonth\r\n\r\nThis means that:\r\n```ts\r\nconst aug11_2019 = new Date(2019, 7, 11)\r\ngetWeekOfMonth(aug11_2019) === 3 // Sunday, third week of august\r\ngetWeekOfMonth(aug11_2019, {weeksStartsOn: 1) === 2 // Week starts on Monday, we're still on week 2\r\n```\r\n\r\n(supposedly, there's a known bug with Sundays: https://github.com/date-fns/date-fns/issues/1040)\r\n\r\nI expect the `getDay` function to let me set `locale, weeksStartsOn` too, so that:\r\n\r\n```ts\r\ngetDay(aug11_2019) === 0 // Sunday, first day of the week\r\ngetDay(aug11_2019, {weeksStartsOn: 1) === 6 // last day of 
the previous week\r\n```","html":"

getDay doesn't let you set weekStartsOn

\n\n

Hi there, thanks for maintaining this super useful library,\nI have a remark / request / issue regarding getDay:

\n\n

getWeekOfMonth accept options such as locale, weekStartsOn.\nhttps://date-fns.org/v2.0.0-beta.4/docs/getWeekOfMonth

\n\n

This means that:\nconst aug11_2019 = new Date(2019, 7, 11)\ngetWeekOfMonth(aug11_2019) === 3 // Sunday, third week of august\ngetWeekOfMonth(aug11_2019, {weeksStartsOn: 1) === 2 // Week starts on Monday, we're still on week 2\n

\n\n

(supposedly, there's a known bug with Sundays: https://github.com/date-fns/date-fns/issues/1040)

\n\n

I expect the getDay function to let me set locale, weeksStartsOn too, so that:

\n\n

getDay(aug11_2019) === 0 // Sunday, first day of the week\ngetDay(aug11_2019, {weeksStartsOn: 1) === 6 // last day of the previous week\n

\n","meta":{"source":"GitHub","url":"https://github.com/date-fns/date-fns/issues/1287"},"_input_hash":1706853340,"_task_hash":-1037532304,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"0.1.32 seems to run fine also without host application","meta":{"source":"GitHub","url":"https://github.com/passff/passff/issues/216"},"label":"DOCUMENTATION","_input_hash":-830923609,"_task_hash":194505475,"answer":"reject"} {"text":"Consider publishing the API online","meta":{"source":"GitHub","url":"https://github.com/SebastianoF/bruker2nifti/issues/6"},"label":"DOCUMENTATION","_input_hash":-1601813447,"_task_hash":-906900108,"answer":"accept"} {"text":"Kinetic, compiling error","meta":{"source":"GitHub","url":"https://github.com/ethz-asl/nbvplanner/issues/18"},"label":"DOCUMENTATION","_input_hash":540284518,"_task_hash":1083565933,"answer":"reject"} {"text":"Add missing temp module for sme","meta":{"source":"GitHub","url":"https://github.com/mgechev/angular-seed/issues/2035"},"label":"DOCUMENTATION","_input_hash":1700160586,"_task_hash":-1944268761,"answer":"reject"} {"text":"REPL in the website ?","meta":{"source":"GitHub","url":"https://github.com/cljs/site/issues/26"},"label":"DOCUMENTATION","_input_hash":694440208,"_task_hash":1520323648,"answer":"accept"} {"text":"Segfault in tac_plus","meta":{"source":"GitHub","url":"https://github.com/facebook/tac_plus/issues/11"},"label":"DOCUMENTATION","_input_hash":-1906837944,"_task_hash":-1401233493,"answer":"reject"} {"text":"Mailchimp v3 use FNAME and LNAME for firstName and lastName ","meta":{"source":"GitHub","url":"https://github.com/spatie/laravel-newsletter/issues/99"},"label":"DOCUMENTATION","_input_hash":-1395146723,"_task_hash":-1272760511,"answer":"reject"} {"text":"# repsonse file option missing?\n\n### Which version of the AzCopy was used? \r\n10.2.1\r\n##### Note: The version is visible when running AzCopy without any argument\r\n\r\n### Which platform are you using? 
(ex: Windows, Mac, Linux)\r\nWindows\r\n\r\n### What command did you run?\r\nazcopy /@c:\\...\\x.responsefile\r\n\r\n##### Note: Please remove the SAS to avoid exposing your credentials. If you cannot remember the exact command, please retrieve it from the beginning of the log file.\r\n\r\n### What problem was encountered?\r\nThe standard help string showed up\r\n\r\n### How can we reproduce the problem in the simplest way?\r\nCall azcopy with the usual responsefile option\r\n\r\n### Have you found a mitigation/solution?\r\nNo. I can use the command line, but with all those special characters in the SAS, I found it impossible to put it into a batch file\r\n","title":"repsonse file option missing?","body":"### Which version of the AzCopy was used? \r\n10.2.1\r\n##### Note: The version is visible when running AzCopy without any argument\r\n\r\n### Which platform are you using? (ex: Windows, Mac, Linux)\r\nWindows\r\n\r\n### What command did you run?\r\nazcopy /@c:\\...\\x.responsefile\r\n\r\n##### Note: Please remove the SAS to avoid exposing your credentials. If you cannot remember the exact command, please retrieve it from the beginning of the log file.\r\n\r\n### What problem was encountered?\r\nThe standard help string showed up\r\n\r\n### How can we reproduce the problem in the simplest way?\r\nCall azcopy with the usual responsefile option\r\n\r\n### Have you found a mitigation/solution?\r\nNo. I can use the command line, but with all those special characters in the SAS, I found it impossible to put it into a batch file\r\n","html":"

response file option missing?

\n\n

Which version of the AzCopy was used?

\n\n

10.2.1

\n\n
Note: The version is visible when running AzCopy without any argument
\n\n

Which platform are you using? (ex: Windows, Mac, Linux)

\n\n

Windows

\n\n

What command did you run?

\n\n

azcopy /@c:...\\x.responsefile

\n\n
Note: Please remove the SAS to avoid exposing your credentials. If you cannot remember the exact command, please retrieve it from the beginning of the log file.
\n\n

What problem was encountered?

\n\n

The standard help string showed up

\n\n

How can we reproduce the problem in the simplest way?

\n\n

Call azcopy with the usual responsefile option

\n\n

Have you found a mitigation/solution?

\n\n

No. I can use the command line, but with all those special characters in the SAS, I found it impossible to put it into a batch file

\n","meta":{"source":"GitHub","url":"https://github.com/Azure/azure-storage-azcopy/issues/543"},"_input_hash":-1038645938,"_task_hash":1544763971,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Add contact details to Google Doc","meta":{"source":"GitHub","url":"https://github.com/159356-1702-Extramural/capstone/issues/9"},"label":"DOCUMENTATION","_input_hash":-1995482138,"_task_hash":-1031908930,"answer":"reject"} {"text":"# Broken link\n\nA schematic of the original hardware can be found here:\r\nhttp://swhs.home.xs4all.nl/bbc/mmbeeb/\r\nThis link is broken non existent in the documentation readme.txt.","title":"Broken link","body":"A schematic of the original hardware can be found here:\r\nhttp://swhs.home.xs4all.nl/bbc/mmbeeb/\r\nThis link is broken non existent in the documentation readme.txt.","html":"

Broken link

\n\n

A schematic of the original hardware can be found here:\nhttp://swhs.home.xs4all.nl/bbc/mmbeeb/\nThis link is broken/non-existent in the documentation readme.txt.

\n","meta":{"source":"GitHub","url":"https://github.com/hoglet67/MMFS/issues/15"},"_input_hash":-1396635589,"_task_hash":-40231009,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"CF v269","meta":{"source":"GitHub","url":"https://github.com/cloudfoundry/cf-final-release-election/issues/42"},"label":"DOCUMENTATION","_input_hash":994070213,"_task_hash":520499459,"answer":"reject"} {"text":"Fix download command for README.","meta":{"source":"GitHub","url":"https://github.com/AkkeyLab/mac-auto-setup/issues/58"},"label":"DOCUMENTATION","_input_hash":314391217,"_task_hash":107547754,"answer":"accept"} {"text":"# CherryPy - Pytest hangs with default \"interactive\" mode \n\n**I'm submitting a ...**\r\n- [ ] bug report\r\n- [X] feature request\r\n- [ ] question about the decisions made in the repository\r\n\r\n**Do you want to request a *feature* or report a *bug*?**\r\nA \"feature\" request, mostly for the documentation rather than for the actual code of Cherrypy.\r\n\r\n**What is the current behavior?**\r\nRunning Cherrypy helper.CPWebCase tests hangs on failed assertions due to interactive mode, that can be disabled with helper.CPWebCase.interactive = False or through an enviroment variable (WEBTEST_INTERACTIVE).\r\nBut I had to look at the source code of CPWebCase to understand what was going on, with trial and error. I was not able to find documentation about this behaviour until I found this page: https://schneide.blog/2017/02/06/integration-tests-with-cherrypy-and-requests/ that reported my exact problem (read the first lines of the article).\r\n\r\n**If the current behavior is a bug, please provide the steps to reproduce and if possible a screenshots and logs of the problem. 
If you can, show us your code.**\r\n\r\nA minimal example to reproduce the problem (tries to get a not existing url):\r\n\r\n```\r\nfrom cherrypy.test import helper\r\n\r\nclass TestSample(helper.CPWebCase):\r\n\r\n def test_sample(self):\r\n self.getPage('/')\r\n self.assertStatus(\"200 OK\")\r\n```\r\n\r\nThe test will hang forever if \"interactive\" is True.\r\n\r\n**What is the expected behavior?**\r\nPlease tell why \"interactive\" is true by default and why there is no documentation about the possibility to disable it/how to disable it.\r\n\r\n**Please tell us about your environment:**\r\n\r\n- Cheroot version: 6.5.5\r\n- CherryPy version: 18.1.2\r\n- Python version: 3.7.3\r\n- OS: Windows\r\n- Browser: not relevant.\r\n- pytest: 5.0.1","title":"CherryPy - Pytest hangs with default \"interactive\" mode ","body":"**I'm submitting a ...**\r\n- [ ] bug report\r\n- [X] feature request\r\n- [ ] question about the decisions made in the repository\r\n\r\n**Do you want to request a *feature* or report a *bug*?**\r\nA \"feature\" request, mostly for the documentation rather than for the actual code of Cherrypy.\r\n\r\n**What is the current behavior?**\r\nRunning Cherrypy helper.CPWebCase tests hangs on failed assertions due to interactive mode, that can be disabled with helper.CPWebCase.interactive = False or through an enviroment variable (WEBTEST_INTERACTIVE).\r\nBut I had to look at the source code of CPWebCase to understand what was going on, with trial and error. I was not able to find documentation about this behaviour until I found this page: https://schneide.blog/2017/02/06/integration-tests-with-cherrypy-and-requests/ that reported my exact problem (read the first lines of the article).\r\n\r\n**If the current behavior is a bug, please provide the steps to reproduce and if possible a screenshots and logs of the problem. 
If you can, show us your code.**\r\n\r\nA minimal example to reproduce the problem (tries to get a not existing url):\r\n\r\n```\r\nfrom cherrypy.test import helper\r\n\r\nclass TestSample(helper.CPWebCase):\r\n\r\n def test_sample(self):\r\n self.getPage('/')\r\n self.assertStatus(\"200 OK\")\r\n```\r\n\r\nThe test will hang forever if \"interactive\" is True.\r\n\r\n**What is the expected behavior?**\r\nPlease tell why \"interactive\" is true by default and why there is no documentation about the possibility to disable it/how to disable it.\r\n\r\n**Please tell us about your environment:**\r\n\r\n- Cheroot version: 6.5.5\r\n- CherryPy version: 18.1.2\r\n- Python version: 3.7.3\r\n- OS: Windows\r\n- Browser: not relevant.\r\n- pytest: 5.0.1","html":"

CherryPy - Pytest hangs with default \"interactive\" mode

\n\n

I'm submitting a ...\n- [ ] bug report\n- [X] feature request\n- [ ] question about the decisions made in the repository

\n\n

Do you want to request a feature or report a bug?\nA \"feature\" request, mostly for the documentation rather than for the actual code of Cherrypy.

\n\n

What is the current behavior?\nRunning Cherrypy helper.CPWebCase tests hang on failed assertions due to interactive mode, which can be disabled with helper.CPWebCase.interactive = False or through an environment variable (WEBTEST_INTERACTIVE).\nBut I had to look at the source code of CPWebCase to understand what was going on, with trial and error. I was not able to find documentation about this behaviour until I found this page: https://schneide.blog/2017/02/06/integration-tests-with-cherrypy-and-requests/ that reported my exact problem (read the first lines of the article).

\n\n

If the current behavior is a bug, please provide the steps to reproduce and if possible a screenshots and logs of the problem. If you can, show us your code.

\n\n

A minimal example to reproduce the problem (tries to get a not existing url):

\n\n

```\nfrom cherrypy.test import helper

\n\n

class TestSample(helper.CPWebCase):

\n\n
def test_sample(self):\n    self.getPage('/')\n    self.assertStatus(\"200 OK\")\n
\n\n

```

\n\n

The test will hang forever if \"interactive\" is True.

\n\n

What is the expected behavior?\nPlease tell why \"interactive\" is true by default and why there is no documentation about the possibility to disable it/how to disable it.

\n\n

Please tell us about your environment:

\n\n
    \n
  • Cheroot version: 6.5.5
  • \n
  • CherryPy version: 18.1.2
  • \n
  • Python version: 3.7.3
  • \n
  • OS: Windows
  • \n
  • Browser: not relevant.
  • \n
  • pytest: 5.0.1
  • \n
\n","meta":{"source":"GitHub","url":"https://github.com/cherrypy/cherrypy/issues/1799"},"_input_hash":-1432683765,"_task_hash":-482709977,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# Custom Commands that return void are still chainable.\n\nWhen I have a custom command defined as:\r\n```\r\nmyCommand = () => { \r\nreturn void;\r\n}\r\n```\r\nAnd I use it like so:\r\n```\r\nCy.myCommand().pause(). \r\n```\r\n\r\nI would expect to hit a run time error since my custom command `myCommand` doesn't return a `Cypress.Chainable`,\r\n\r\nBut instead the `pause()` command executes after I chain if off of my void method, and I am able to continue chaining even further.\r\n\r\n","title":"Custom Commands that return void are still chainable.","body":"When I have a custom command defined as:\r\n```\r\nmyCommand = () => { \r\nreturn void;\r\n}\r\n```\r\nAnd I use it like so:\r\n```\r\nCy.myCommand().pause(). \r\n```\r\n\r\nI would expect to hit a run time error since my custom command `myCommand` doesn't return a `Cypress.Chainable`,\r\n\r\nBut instead the `pause()` command executes after I chain if off of my void method, and I am able to continue chaining even further.\r\n\r\n","html":"

Custom Commands that return void are still chainable.

\n\n

When I have a custom command defined as:\n\nmyCommand = () => { \nreturn void;\n}\n\nAnd I use it like so:\n\nCy.myCommand().pause(). \n

\n\n

I would expect to hit a run time error since my custom command myCommand doesn't return a Cypress.Chainable<T>,

\n\n

But instead the pause() command executes after I chain if off of my void method, and I am able to continue chaining even further.

\n","meta":{"source":"GitHub","url":"https://github.com/cypress-io/cypress/issues/4968"},"_input_hash":1726183708,"_task_hash":-62291832,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Bind Control Click to Command Click?","meta":{"source":"GitHub","url":"https://github.com/Hammerspoon/hammerspoon/issues/1503"},"label":"DOCUMENTATION","_input_hash":-1119226228,"_task_hash":-916101097,"answer":"reject"} {"text":"Old README","meta":{"source":"GitHub","url":"https://github.com/SEL4PROJ/camkes-arm-vm/issues/2"},"label":"DOCUMENTATION","_input_hash":1164078021,"_task_hash":-1388143760,"answer":"accept"} {"text":"Instructions for using the markdown editor","meta":{"source":"GitHub","url":"https://github.com/timitoc/LearningPointers/issues/2"},"label":"DOCUMENTATION","_input_hash":-997130370,"_task_hash":-210660499,"answer":"accept"} {"text":"Update to v4.2","meta":{"source":"GitHub","url":"https://github.com/awesomeWM/awesome/issues/1943"},"label":"DOCUMENTATION","_input_hash":-339822489,"_task_hash":-734455376,"answer":"reject"} {"text":"How to compile on arm?","meta":{"source":"GitHub","url":"https://github.com/teeworlds/teeworlds/issues/1497"},"label":"DOCUMENTATION","_input_hash":-1102171850,"_task_hash":174054912,"answer":"reject"} {"text":"Inconsistent strategy for .gitignore and deploy-exclude.txt files","meta":{"source":"GitHub","url":"https://github.com/acquia/blt/issues/1839"},"label":"DOCUMENTATION","_input_hash":-2029444799,"_task_hash":1783335535,"answer":"reject"} {"text":"Markdown link open a new tab/window","meta":{"source":"GitHub","url":"https://github.com/swagger-api/swagger-ui/issues/3473"},"label":"DOCUMENTATION","_input_hash":-1816525142,"_task_hash":229590867,"answer":"reject"} {"text":"README.md img","meta":{"source":"GitHub","url":"https://github.com/suroxdesigns/orion/issues/1"},"label":"DOCUMENTATION","_input_hash":-1781812512,"_task_hash":-452809614,"answer":"accept"} {"text":"Development of the library with own 
solutions","meta":{"source":"GitHub","url":"https://github.com/MIT-LCP/wfdb-python/issues/65"},"label":"DOCUMENTATION","_input_hash":-1700781630,"_task_hash":-1804773990,"answer":"reject"} {"text":"asjp dist files","meta":{"source":"GitHub","url":"https://github.com/ddediu/lgfam-newick/issues/4"},"label":"DOCUMENTATION","_input_hash":-1055929253,"_task_hash":-2036658294,"answer":"reject"} {"text":"Noble was installed on RPi0 but I can't run it ","meta":{"source":"GitHub","url":"https://github.com/sandeepmistry/noble/issues/659"},"label":"DOCUMENTATION","_input_hash":-1596641324,"_task_hash":193079526,"answer":"reject"} {"text":"# Config\n\nI found a few things that could improve the API reviewing the project:\r\n\r\n1. Documentational comments explaining your javascript functions and less readable components in them\r\n\r\n2. Quite nuanced, but having your database first in the compose file will cause it to start first, meaning that you can have the database running without the app, which might cause `port already in use errors` on restart and using resources to maintain the Mongo server without it being used.\r\n\r\n3. If you don't have the Docker containers hard coded to each other directly or through env variables, this poses a security risk in production, especially with the database. Docker can recognize env vars and files at runtime. \r\n\r\n4. I'm not sure how Docker does with cookies out of the box, but if you have issues with authentication try setting up a reverse proxy to pass along headers and cookies, and prevent CORS errors. \r\n\r\n5. Architectural- bin is for built executables (binaries), such as bin/www (a conventional name for web server startup scripts). Did you mean `src` when naming this folder? The architecture could be improved by delegating different folders to more broad functionality. 
\r\n\r\nOtherwise, very well written code, consistently great use of syntactic sugar and the decisions you made in tooling + functionality show sophisticated design and research. Refactor the architecture and the project will shine. ","title":"Config","body":"I found a few things that could improve the API reviewing the project:\r\n\r\n1. Documentational comments explaining your javascript functions and less readable components in them\r\n\r\n2. Quite nuanced, but having your database first in the compose file will cause it to start first, meaning that you can have the database running without the app, which might cause `port already in use errors` on restart and using resources to maintain the Mongo server without it being used.\r\n\r\n3. If you don't have the Docker containers hard coded to each other directly or through env variables, this poses a security risk in production, especially with the database. Docker can recognize env vars and files at runtime. \r\n\r\n4. I'm not sure how Docker does with cookies out of the box, but if you have issues with authentication try setting up a reverse proxy to pass along headers and cookies, and prevent CORS errors. \r\n\r\n5. Architectural- bin is for built executables (binaries), such as bin/www (a conventional name for web server startup scripts). Did you mean `src` when naming this folder? The architecture could be improved by delegating different folders to more broad functionality. \r\n\r\nOtherwise, very well written code, consistently great use of syntactic sugar and the decisions you made in tooling + functionality show sophisticated design and research. Refactor the architecture and the project will shine. ","html":"

Config

\n\n

I found a few things that could improve the API reviewing the project:

\n\n
    \n
  1. Documentational comments explaining your javascript functions and less readable components in them

  2. \n
  3. Quite nuanced, but having your database first in the compose file will cause it to start first, meaning that you can have the database running without the app, which might cause port already in use errors on restart and using resources to maintain the Mongo server without it being used.

  4. \n
  5. If you don't have the Docker containers hard coded to each other directly or through env variables, this poses a security risk in production, especially with the database. Docker can recognize env vars and files at runtime.

  6. \n
  7. I'm not sure how Docker does with cookies out of the box, but if you have issues with authentication try setting up a reverse proxy to pass along headers and cookies, and prevent CORS errors.

  8. \n
  9. Architectural- bin is for built executables (binaries), such as bin/www (a conventional name for web server startup scripts). Did you mean src when naming this folder? The architecture could be improved by delegating different folders to more broad functionality.

  10. \n
\n\n

Otherwise, very well written code, consistently great use of syntactic sugar and the decisions you made in tooling + functionality show sophisticated design and research. Refactor the architecture and the project will shine.

\n","meta":{"source":"GitHub","url":"https://github.com/ThomasLee94/southpark-api/issues/2"},"_input_hash":-1730915020,"_task_hash":1857820270,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"sles12 sp2 installation loops after first stage. Never gets to stage 2.","meta":{"source":"GitHub","url":"https://github.com/cobbler/cobbler/issues/1815"},"label":"DOCUMENTATION","_input_hash":-820177493,"_task_hash":-474151459,"answer":"reject"} {"text":"# File handler: trailing newline position is not defined \n\nI've done an error here: https://github.com/B2W-BIT/aiologger/blob/master/aiologger/handlers/files.py#L79\r\n\r\nAs you told in the docs, write order is not guaranteed. That's why sometimes 2 log lines concats, and then there are doing 2 empty lines. I'll return str concatenation back","title":"File handler: trailing newline position is not defined ","body":"I've done an error here: https://github.com/B2W-BIT/aiologger/blob/master/aiologger/handlers/files.py#L79\r\n\r\nAs you told in the docs, write order is not guaranteed. That's why sometimes 2 log lines concats, and then there are doing 2 empty lines. I'll return str concatenation back","html":"

File handler: trailing newline position is not defined

\n\n

I've done an error here: https://github.com/B2W-BIT/aiologger/blob/master/aiologger/handlers/files.py#L79

\n\n

As you told in the docs, write order is not guaranteed. That's why sometimes 2 log lines concats, and then there are doing 2 empty lines. I'll return str concatenation back

\n","meta":{"source":"GitHub","url":"https://github.com/B2W-BIT/aiologger/issues/78"},"_input_hash":551200969,"_task_hash":-2118119897,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Notebooks not executing","meta":{"source":"GitHub","url":"https://github.com/spatialaudio/nbsphinx/issues/117"},"label":"DOCUMENTATION","_input_hash":-147741326,"_task_hash":536377071,"answer":"reject"} {"text":"Exclude Debug","meta":{"source":"GitHub","url":"https://github.com/xd009642/tarpaulin/issues/18"},"label":"DOCUMENTATION","_input_hash":-1898129677,"_task_hash":1914118098,"answer":"reject"} {"text":"Loading problem","meta":{"source":"GitHub","url":"https://github.com/DavideViolante/Angular-Full-Stack/issues/92"},"label":"DOCUMENTATION","_input_hash":420242024,"_task_hash":-747234108,"answer":"reject"} {"text":"documentation: `vars` option is missing for e.g. follow","meta":{"source":"GitHub","url":"https://github.com/aixigo/hal-http-client/issues/14"},"label":"DOCUMENTATION","_input_hash":-733660292,"_task_hash":-607341400,"answer":"accept"} {"text":"loops-test.js and instructions don't match for whileLoop(n)","meta":{"source":"GitHub","url":"https://github.com/learn-co-curriculum/javascript-intro-to-looping/issues/34"},"label":"DOCUMENTATION","_input_hash":-654241650,"_task_hash":673681562,"answer":"accept"} {"text":"# Documentation\n\nAdd documentation for available events and functions","title":"Documentation","body":"Add documentation for available events and functions","html":"

Documentation

\n\n

Add documentation for available events and functions

\n","meta":{"source":"GitHub","url":"https://github.com/kniffen/TruckSim-Telemetry/issues/3"},"_input_hash":-1723430974,"_task_hash":-1632457284,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"set_opacity not working for Circle edge","meta":{"source":"GitHub","url":"https://github.com/WorldWideTelescope/wwt-web-client/issues/159"},"label":"DOCUMENTATION","_input_hash":1449688234,"_task_hash":859184335,"answer":"reject"} {"text":"dp.kinect not registered after writing registration name and opening .dpreg ","meta":{"source":"GitHub","url":"https://github.com/diablodale/dp.kinect/issues/46"},"label":"DOCUMENTATION","_input_hash":565080742,"_task_hash":-467619989,"answer":"reject"} {"text":"# Private registry key usage\n\n### Description\r\nNot sure if this warrants a new issue given its closely related to the already closed [#511 ](https://github.com/polyaxon/polyaxon/issues/511), but I think I've got the other side of the issue here.\r\n Also, total kubernetes newcomer, so apologies if this is terribly dumb. I'm trying to use a private registry (ECR) instead of the in-cluster docker registry for both pushing and pulling images. \r\n\r\n### To Reproduce\r\nI added the secret with kubectl:\r\n`kubectl create secret generic dconf --from-file=sec_config.json=./sec_config.json -n polyaxon`\r\n\r\nwhere the file was like `{\"credsStore\":\"reallyLongPasssword\"}`, with the password coming from `aws ecr get-login`. \r\n\r\n### Expected behavior\r\nIf I understood the docs correctly, it looks like the key should just be listed in the UI, but I don't see anything there, nor was there anything after I ran an upgrade using the polyaxon CLI tool. 
\r\n\r\nAny help would be much appreciated!\r\n\r\n### Environment\r\npolyaxon 0.5.5\r\nkubernetes+helm","title":"Private registry key usage","body":"### Description\r\nNot sure if this warrants a new issue given its closely related to the already closed [#511 ](https://github.com/polyaxon/polyaxon/issues/511), but I think I've got the other side of the issue here.\r\n Also, total kubernetes newcomer, so apologies if this is terribly dumb. I'm trying to use a private registry (ECR) instead of the in-cluster docker registry for both pushing and pulling images. \r\n\r\n### To Reproduce\r\nI added the secret with kubectl:\r\n`kubectl create secret generic dconf --from-file=sec_config.json=./sec_config.json -n polyaxon`\r\n\r\nwhere the file was like `{\"credsStore\":\"reallyLongPasssword\"}`, with the password coming from `aws ecr get-login`. \r\n\r\n### Expected behavior\r\nIf I understood the docs correctly, it looks like the key should just be listed in the UI, but I don't see anything there, nor was there anything after I ran an upgrade using the polyaxon CLI tool. \r\n\r\nAny help would be much appreciated!\r\n\r\n### Environment\r\npolyaxon 0.5.5\r\nkubernetes+helm","html":"

Private registry key usage

\n\n

Description

\n\n

Not sure if this warrants a new issue given its closely related to the already closed #511 , but I think I've got the other side of the issue here.\n Also, total kubernetes newcomer, so apologies if this is terribly dumb. I'm trying to use a private registry (ECR) instead of the in-cluster docker registry for both pushing and pulling images.

\n\n

To Reproduce

\n\n

I added the secret with kubectl:\nkubectl create secret generic dconf --from-file=sec_config.json=./sec_config.json -n polyaxon

\n\n

where the file was like {\"credsStore\":\"reallyLongPasssword\"}, with the password coming from aws ecr get-login.

\n\n

Expected behavior

\n\n

If I understood the docs correctly, it looks like the key should just be listed in the UI, but I don't see anything there, nor was there anything after I ran an upgrade using the polyaxon CLI tool.

\n\n

Any help would be much appreciated!

\n\n

Environment

\n\n

polyaxon 0.5.5\nkubernetes+helm

\n","meta":{"source":"GitHub","url":"https://github.com/polyaxon/polyaxon/issues/552"},"_input_hash":2082739513,"_task_hash":-1203539624,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Training on VOC2011","meta":{"source":"GitHub","url":"https://github.com/JihongJu/keras-fcn/issues/25"},"label":"DOCUMENTATION","_input_hash":365926736,"_task_hash":1188992007,"answer":"reject"} {"text":"RHOAR Integration","meta":{"source":"GitHub","url":"https://github.com/bucharest-gold/entente/issues/171"},"label":"DOCUMENTATION","_input_hash":1629301801,"_task_hash":-1448385730,"answer":"reject"} {"text":"# url-prefix option is needed\n\nIf you want to deploy your documentation to gh-pages, for instance, you will have the following base url: `https://something.github.io/yourreponame`. The problem is that all urls generated by `Perl6::Documentable` are relative to the root source, which in this case, is `https://something.github.io/`.\r\n\r\nSo an easy solution is to provide a new option to add an arbitrary prefix to all urls, through `&rewrite-url`.","title":"url-prefix option is needed","body":"If you want to deploy your documentation to gh-pages, for instance, you will have the following base url: `https://something.github.io/yourreponame`. The problem is that all urls generated by `Perl6::Documentable` are relative to the root source, which in this case, is `https://something.github.io/`.\r\n\r\nSo an easy solution is to provide a new option to add an arbitrary prefix to all urls, through `&rewrite-url`.","html":"

url-prefix option is needed

\n\n

If you want to deploy your documentation to gh-pages, for instance, you will have the following base url: https://something.github.io/yourreponame. The problem is that all urls generated by Perl6::Documentable are relative to the root source, which in this case, is https://something.github.io/.

\n\n

So an easy solution is to provide a new option to add an arbitrary prefix to all urls, through &rewrite-url.

\n","meta":{"source":"GitHub","url":"https://github.com/antoniogamiz/Perl6-Documentable/issues/90"},"_input_hash":-1240819933,"_task_hash":-1406064261,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# Discord link broken?\n\nNot really an issue, but both the discord invite on the README.md page and on the website are broken.","title":"Discord link broken?","body":"Not really an issue, but both the discord invite on the README.md page and on the website are broken.","html":"

Discord link broken?

\n\n

Not really an issue, but both the discord invite on the README.md page and on the website are broken.

\n","meta":{"source":"GitHub","url":"https://github.com/minecraft-dev/MinecraftDev/issues/645"},"_input_hash":-1326257002,"_task_hash":-1076619905,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"Add actual exercises","meta":{"source":"GitHub","url":"https://github.com/Awlexus/HappyHandBreaks.Android/issues/3"},"label":"DOCUMENTATION","_input_hash":-480879661,"_task_hash":718678623,"answer":"reject"} {"text":"# ejercicio 2 workshop endpoint\n\ndejo la rama que cree para las pruebas del endpoint de mercado libre https://github.com/devjaime/clone-mercadolibre\r\n\r\nEstoy modificando el tama\u00f1o de la letra el tema, y otras cosas para que quede m\u00e1s presentable.\r\nEn el readme estoy anotando todo lo aprendido, \r\nPara este ejemplo agregue context solo porque me resultaba m\u00e1s f\u00e1cil leer el c\u00f3digo. Consulta si no ocupara axios, podria ocupar fetch directamente?\r\nhttps://es.reactjs.org/docs/context.html\r\n\r\n![image](https://user-images.githubusercontent.com/26843824/62831201-6521d180-bbe9-11e9-90f4-11bb628c9acb.png)\r\n\r\n","title":"ejercicio 2 workshop endpoint","body":"dejo la rama que cree para las pruebas del endpoint de mercado libre https://github.com/devjaime/clone-mercadolibre\r\n\r\nEstoy modificando el tama\u00f1o de la letra el tema, y otras cosas para que quede m\u00e1s presentable.\r\nEn el readme estoy anotando todo lo aprendido, \r\nPara este ejemplo agregue context solo porque me resultaba m\u00e1s f\u00e1cil leer el c\u00f3digo. Consulta si no ocupara axios, podria ocupar fetch directamente?\r\nhttps://es.reactjs.org/docs/context.html\r\n\r\n![image](https://user-images.githubusercontent.com/26843824/62831201-6521d180-bbe9-11e9-90f4-11bb628c9acb.png)\r\n\r\n","html":"

ejercicio 2 workshop endpoint

\n\n

dejo la rama que cree para las pruebas del endpoint de mercado libre https://github.com/devjaime/clone-mercadolibre

\n\n

Estoy modificando el tama\u00f1o de la letra el tema, y otras cosas para que quede m\u00e1s presentable.\nEn el readme estoy anotando todo lo aprendido, \nPara este ejemplo agregue context solo porque me resultaba m\u00e1s f\u00e1cil leer el c\u00f3digo. Consulta si no ocupara axios, podria ocupar fetch directamente?\nhttps://es.reactjs.org/docs/context.html

\n\n

\"image\"

\n","meta":{"source":"GitHub","url":"https://github.com/mercadolibre-workshops/react-workshop/issues/3"},"_input_hash":1131689420,"_task_hash":-1466656023,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Update documentation with get_datacenter_by_name method","meta":{"source":"GitHub","url":"https://github.com/profitbricks/profitbricks-sdk-python/issues/68"},"label":"DOCUMENTATION","_input_hash":94024930,"_task_hash":1192908390,"answer":"accept"} {"text":"Add \"Airports Web\" API","meta":{"source":"GitHub","url":"https://github.com/APIs-guru/openapi-directory/issues/246"},"label":"DOCUMENTATION","_input_hash":-1995752479,"_task_hash":57615958,"answer":"reject"} {"text":"# Add CloudWatch agent for disk space and memory monitoring\n\nFor users that don't have New Relic or similar tools, we want to provide the ability to monitor disk space and memory usage by enabling those metrics.\r\n\r\nCloudWatch agent https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/Install-CloudWatch-Agent.html provides disk and memory metrics https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/metrics-collected-by-CloudWatch-agent.html, and there are some Puppet modules that seem to support the provisioning of CloudWatch monitoring.\r\nOne thing to note is that we would like to support the provisioning of CloudWatch monitoring on RHEL7 and Amazon Linux 2 on Packer AEM.\r\n\r\nConsistent to other feature flags, please add a configuration property `aws.install_cloudwatchagent` to enable/disable CloudWatch agent installation.\r\n\r\n","title":"Add CloudWatch agent for disk space and memory monitoring","body":"For users that don't have New Relic or similar tools, we want to provide the ability to monitor disk space and memory usage by enabling those metrics.\r\n\r\nCloudWatch agent https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/Install-CloudWatch-Agent.html provides disk and memory metrics 
https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/metrics-collected-by-CloudWatch-agent.html, and there are some Puppet modules that seem to support the provisioning of CloudWatch monitoring.\r\nOne thing to note is that we would like to support the provisioning of CloudWatch monitoring on RHEL7 and Amazon Linux 2 on Packer AEM.\r\n\r\nConsistent to other feature flags, please add a configuration property `aws.install_cloudwatchagent` to enable/disable CloudWatch agent installation.\r\n\r\n","html":"

Add CloudWatch agent for disk space and memory monitoring

\n\n

For users that don't have New Relic or similar tools, we want to provide the ability to monitor disk space and memory usage by enabling those metrics.

\n\n

CloudWatch agent https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/Install-CloudWatch-Agent.html provides disk and memory metrics https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/metrics-collected-by-CloudWatch-agent.html, and there are some Puppet modules that seem to support the provisioning of CloudWatch monitoring.\nOne thing to note is that we would like to support the provisioning of CloudWatch monitoring on RHEL7 and Amazon Linux 2 on Packer AEM.

\n\n

Consistent to other feature flags, please add a configuration property aws.install_cloudwatchagent to enable/disable CloudWatch agent installation.

\n","meta":{"source":"GitHub","url":"https://github.com/shinesolutions/packer-aem/issues/166"},"_input_hash":498560581,"_task_hash":-1002173864,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"auto refresh page feature?","meta":{"source":"GitHub","url":"https://github.com/lihaoyi/Scalatex/issues/59"},"label":"DOCUMENTATION","_input_hash":-1063992031,"_task_hash":835052103,"answer":"reject"} {"text":"Edit README","meta":{"source":"GitHub","url":"https://github.com/malpercio/sails-industrial-factory/issues/2"},"label":"DOCUMENTATION","_input_hash":1234771079,"_task_hash":2113657916,"answer":"accept"} {"text":"# Add Call of Duty 4 : Modern Warfare auto splitter\n\nhttps://raw.githubusercontent.com/KunoDemetries/cod4/master/mw.asl\r\n\r\nI wasn't sure how to add it so I made it into a repositories file. \r\n\r\nit has start, split, reset, and remove loads.","title":"Add Call of Duty 4 : Modern Warfare auto splitter","body":"https://raw.githubusercontent.com/KunoDemetries/cod4/master/mw.asl\r\n\r\nI wasn't sure how to add it so I made it into a repositories file. \r\n\r\nit has start, split, reset, and remove loads.","html":"

Add Call of Duty 4 : Modern Warfare auto splitter

\n\n

https://raw.githubusercontent.com/KunoDemetries/cod4/master/mw.asl

\n\n

I wasn't sure how to add it so I made it into a repositories file.

\n\n

it has start, split, reset, and remove loads.

\n","meta":{"source":"GitHub","url":"https://github.com/LiveSplit/LiveSplit/issues/1453"},"_input_hash":849620909,"_task_hash":630770684,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Update DOCS for security_groups in aws_instance","meta":{"source":"GitHub","url":"https://github.com/terraform-providers/terraform-provider-aws/issues/1249"},"label":"DOCUMENTATION","_input_hash":785967992,"_task_hash":985015271,"answer":"accept"} {"text":"Template Docs: 'packagename' replacer broken?","meta":{"source":"GitHub","url":"https://github.com/astropy/astropy/issues/6396"},"label":"DOCUMENTATION","_input_hash":1578567653,"_task_hash":-14159993,"answer":"accept"} {"text":"Improve documentation for service and token urls","meta":{"source":"GitHub","url":"https://github.com/watson-developer-cloud/swift-sdk/issues/657"},"label":"DOCUMENTATION","_input_hash":310096459,"_task_hash":481176052,"answer":"accept"} {"text":"# try chunk store\n\ntry indexeddb for chunk store\r\n\r\nhttps://github.com/webtorrent/webtorrent/blob/master/docs/api.md#clientaddtorrentid-opts-function-ontorrent-torrent-\r\n\r\nhttps://www.npmjs.com/package/idb-chunk-store","title":"try chunk store","body":"try indexeddb for chunk store\r\n\r\nhttps://github.com/webtorrent/webtorrent/blob/master/docs/api.md#clientaddtorrentid-opts-function-ontorrent-torrent-\r\n\r\nhttps://www.npmjs.com/package/idb-chunk-store","html":"

try chunk store

\n\n

try indexeddb for chunk store

\n\n

https://github.com/webtorrent/webtorrent/blob/master/docs/api.md#clientaddtorrentid-opts-function-ontorrent-torrent-

\n\n

https://www.npmjs.com/package/idb-chunk-store

\n","meta":{"source":"GitHub","url":"https://github.com/nichoth/clips/issues/2"},"_input_hash":1191057639,"_task_hash":122087704,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# Implement pubsub\n\nIn https://github.com/openzipkin/zipkin-gcp/issues/45, there was a proposal to add pubsub transport (collector and sender), but that never happened.\r\n\r\n@javierviera is currently adding https://github.com/openzipkin/zipkin-go/pull/142 on the golang side, but that's asymmetric and confusing if no server side exists.\r\n\r\nIn any case, the message format should be standard (ListOfSpans in proto or json)\r\n\r\nImplementation wise, sender (java client) should likely use grpc as that's typical. collector should likely use armeria as that's less dependencies and fits into our normal observability tools better (logs metrics tracing) (see stackdriver-storage as an example).\r\n\r\nLooking at the api, it seems there's no grpc endpoint for pubsub, but there's a rest api which likely shares similar auth etc. There seems to be a pull api which could be run in a loop similar to our kafka collectors https://cloud.google.com/pubsub/docs/reference/rest/v1/projects.subscriptions/pull\r\n\r\n@anuraaga have you done any work in pubsub in curiostack?","title":"Implement pubsub","body":"In https://github.com/openzipkin/zipkin-gcp/issues/45, there was a proposal to add pubsub transport (collector and sender), but that never happened.\r\n\r\n@javierviera is currently adding https://github.com/openzipkin/zipkin-go/pull/142 on the golang side, but that's asymmetric and confusing if no server side exists.\r\n\r\nIn any case, the message format should be standard (ListOfSpans in proto or json)\r\n\r\nImplementation wise, sender (java client) should likely use grpc as that's typical. 
collector should likely use armeria as that's less dependencies and fits into our normal observability tools better (logs metrics tracing) (see stackdriver-storage as an example).\r\n\r\nLooking at the api, it seems there's no grpc endpoint for pubsub, but there's a rest api which likely shares similar auth etc. There seems to be a pull api which could be run in a loop similar to our kafka collectors https://cloud.google.com/pubsub/docs/reference/rest/v1/projects.subscriptions/pull\r\n\r\n@anuraaga have you done any work in pubsub in curiostack?","html":"

Implement pubsub

\n\n

In https://github.com/openzipkin/zipkin-gcp/issues/45, there was a proposal to add pubsub transport (collector and sender), but that never happened.

\n\n

@javierviera is currently adding https://github.com/openzipkin/zipkin-go/pull/142 on the golang side, but that's asymmetric and confusing if no server side exists.

\n\n

In any case, the message format should be standard (ListOfSpans in proto or json)

\n\n

Implementation wise, sender (java client) should likely use grpc as that's typical. collector should likely use armeria as that's less dependencies and fits into our normal observability tools better (logs metrics tracing) (see stackdriver-storage as an example).

\n\n

Looking at the api, it seems there's no grpc endpoint for pubsub, but there's a rest api which likely shares similar auth etc. There seems to be a pull api which could be run in a loop similar to our kafka collectors https://cloud.google.com/pubsub/docs/reference/rest/v1/projects.subscriptions/pull

\n\n

@anuraaga have you done any work in pubsub in curiostack?

\n","meta":{"source":"GitHub","url":"https://github.com/openzipkin/zipkin-gcp/issues/132"},"_input_hash":454222797,"_task_hash":-1800817556,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Create on onClick event","meta":{"source":"GitHub","url":"https://github.com/Gbuomprisco/ngx-chips/issues/503"},"label":"DOCUMENTATION","_input_hash":-563857705,"_task_hash":-1168605487,"answer":"reject"} {"text":"# No longer works with upgraded Vue and Vuex\n\nI am using vuex-persist@2.0.1\r\n\r\nI recently upgraded vue and vuex.\r\nvue@2.6.10\r\nvuex@3.1.0\r\n\r\nBefore the upgrade I had vuex-persist working perfectly, in my typescript project, with the provided documentation.\r\nAfter the upgrade, I now get an error in my Store. Here is my code:\r\n```\r\nconst vuexAllModules = new VuexPersistence({\r\n\tstorage: window.localStorage\r\n});\r\n\r\nexport default new Vuex.Store({\r\n\tstrict: !config.isProd,\r\n\tmodules: {\r\n\t\tmodule1,\r\n\t\tmodule2\r\n\t},\r\n\tstate: {},\r\n\tgetters: {},\r\n\tmutations: {},\r\n\tactions: {},\r\n\tplugins: [vuexAllModules.plugin] // <<<<-----SOURCE OF ERROR\r\n});\r\n```\r\n\r\n**ERROR**\r\n```\r\nType 'Plugin[]' is not assignable to type 'Plugin[]'.\r\n Type 'Plugin' is not assignable to type 'Plugin'.\r\n Types of parameters 'store' and 'store' are incompatible.\r\n Type 'Store' is not assignable to type 'Store'.\r\n Types of property 'registerModule' are incompatible.\r\n Type '{ (path: string, module: Module, options?: ModuleOptions | undefined): void; (path: string[], module: Module, options?: ModuleOptions | undefined): void; }' is not assignable to type '{ (path: string, module: Module, options?: ModuleOptions | undefined): void; (path: string[], module: Module, options?: ModuleOptions | undefined): void; }'.\r\n Types of parameters 'module' and 'module' are incompatible.\r\n Type 'Module' is not assignable to type 'Module'.\r\n Types of property 'actions' are incompatible.\r\n Type 'ActionTree | undefined' 
is not assignable to type 'ActionTree | undefined'.\r\n Type 'ActionTree' is not assignable to type 'ActionTree'.\r\n Index signatures are incompatible.\r\n Type 'Action' is not assignable to type 'Action'.\r\n Type 'ActionHandler' is not assignable to type 'Action'.\r\n Type 'ActionHandler' is not assignable to type 'ActionHandler'.\r\n Type 'unknown' is not assignable to type 'RootState'.ts(2322)\r\nindex.d.ts(96, 3): The expected type comes from property 'plugins' which is declared here on type 'StoreOptions'\r\n```","title":"No longer works with upgraded Vue and Vuex","body":"I am using vuex-persist@2.0.1\r\n\r\nI recently upgraded vue and vuex.\r\nvue@2.6.10\r\nvuex@3.1.0\r\n\r\nBefore the upgrade I had vuex-persist working perfectly, in my typescript project, with the provided documentation.\r\nAfter the upgrade, I now get an error in my Store. Here is my code:\r\n```\r\nconst vuexAllModules = new VuexPersistence({\r\n\tstorage: window.localStorage\r\n});\r\n\r\nexport default new Vuex.Store({\r\n\tstrict: !config.isProd,\r\n\tmodules: {\r\n\t\tmodule1,\r\n\t\tmodule2\r\n\t},\r\n\tstate: {},\r\n\tgetters: {},\r\n\tmutations: {},\r\n\tactions: {},\r\n\tplugins: [vuexAllModules.plugin] // <<<<-----SOURCE OF ERROR\r\n});\r\n```\r\n\r\n**ERROR**\r\n```\r\nType 'Plugin[]' is not assignable to type 'Plugin[]'.\r\n Type 'Plugin' is not assignable to type 'Plugin'.\r\n Types of parameters 'store' and 'store' are incompatible.\r\n Type 'Store' is not assignable to type 'Store'.\r\n Types of property 'registerModule' are incompatible.\r\n Type '{ (path: string, module: Module, options?: ModuleOptions | undefined): void; (path: string[], module: Module, options?: ModuleOptions | undefined): void; }' is not assignable to type '{ (path: string, module: Module, options?: ModuleOptions | undefined): void; (path: string[], module: Module, options?: ModuleOptions | undefined): void; }'.\r\n Types of parameters 'module' and 'module' are incompatible.\r\n Type 'Module' is not 
assignable to type 'Module'.\r\n Types of property 'actions' are incompatible.\r\n Type 'ActionTree | undefined' is not assignable to type 'ActionTree | undefined'.\r\n Type 'ActionTree' is not assignable to type 'ActionTree'.\r\n Index signatures are incompatible.\r\n Type 'Action' is not assignable to type 'Action'.\r\n Type 'ActionHandler' is not assignable to type 'Action'.\r\n Type 'ActionHandler' is not assignable to type 'ActionHandler'.\r\n Type 'unknown' is not assignable to type 'RootState'.ts(2322)\r\nindex.d.ts(96, 3): The expected type comes from property 'plugins' which is declared here on type 'StoreOptions'\r\n```","html":"

No longer works with upgraded Vue and Vuex

\n\n

I am using vuex-persist@2.0.1

\n\n

I recently upgraded vue and vuex.\nvue@2.6.10\nvuex@3.1.0

\n\n

Before the upgrade I had vuex-persist working perfectly, in my typescript project, with the provided documentation.\nAfter the upgrade, I now get an error in my Store. Here is my code:\n```\nconst vuexAllModules = new VuexPersistence({\n storage: window.localStorage\n});

\n\n

export default new Vuex.Store({\n strict: !config.isProd,\n modules: {\n module1,\n module2\n },\n state: {},\n getters: {},\n mutations: {},\n actions: {},\n plugins: [vuexAllModules.plugin] // <<<<-----SOURCE OF ERROR\n});\n```

\n\n

ERROR\n\nType 'Plugin<unknown>[]' is not assignable to type 'Plugin<RootState>[]'.\n Type 'Plugin<unknown>' is not assignable to type 'Plugin<RootState>'.\n Types of parameters 'store' and 'store' are incompatible.\n Type 'Store<RootState>' is not assignable to type 'Store<unknown>'.\n Types of property 'registerModule' are incompatible.\n Type '{ <T>(path: string, module: Module<T, RootState>, options?: ModuleOptions | undefined): void; <T>(path: string[], module: Module<T, RootState>, options?: ModuleOptions | undefined): void; }' is not assignable to type '{ <T>(path: string, module: Module<T, unknown>, options?: ModuleOptions | undefined): void; <T>(path: string[], module: Module<T, unknown>, options?: ModuleOptions | undefined): void; }'.\n Types of parameters 'module' and 'module' are incompatible.\n Type 'Module<any, unknown>' is not assignable to type 'Module<any, RootState>'.\n Types of property 'actions' are incompatible.\n Type 'ActionTree<any, unknown> | undefined' is not assignable to type 'ActionTree<any, RootState> | undefined'.\n Type 'ActionTree<any, unknown>' is not assignable to type 'ActionTree<any, RootState>'.\n Index signatures are incompatible.\n Type 'Action<any, unknown>' is not assignable to type 'Action<any, RootState>'.\n Type 'ActionHandler<any, unknown>' is not assignable to type 'Action<any, RootState>'.\n Type 'ActionHandler<any, unknown>' is not assignable to type 'ActionHandler<any, RootState>'.\n Type 'unknown' is not assignable to type 'RootState'.ts(2322)\nindex.d.ts(96, 3): The expected type comes from property 'plugins' which is declared here on type 'StoreOptions<RootState>'\n

\n","meta":{"source":"GitHub","url":"https://github.com/championswimmer/vuex-persist/issues/132"},"_input_hash":-236657404,"_task_hash":-2092288313,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"give some advice on installing Oxcal","meta":{"source":"GitHub","url":"https://github.com/ISAAKiel/oxcAAR/issues/14"},"label":"DOCUMENTATION","_input_hash":-1287883556,"_task_hash":288135206,"answer":"reject"} {"text":"Strange landscapes when m/n > 0.5","meta":{"source":"GitHub","url":"https://github.com/landlab/landlab/issues/515"},"label":"DOCUMENTATION","_input_hash":-128650507,"_task_hash":-670909310,"answer":"reject"} {"text":"Migrate away from wordpress","meta":{"source":"GitHub","url":"https://github.com/AtlassianPS/AtlassianPS/issues/3"},"label":"DOCUMENTATION","_input_hash":-47479988,"_task_hash":-322365294,"answer":"accept"} {"text":"# Section/headers level hierarchy mixup (?) in docs (main index in particular)\n\nI noted some oddities in appearance of `index.rst` - the `Developer Documentation` \"section\" looks somewhat stitched to the end of the `User Documentation` chapter and has no TOC entry. Although it _appears_ to be marked as section according to the Python [devguide](https://devguide.python.org/documenting/#sections) recommendations\r\n> Normally, there are no heading levels assigned to certain characters as the structure is determined from the succession of headings. However, for the Python documentation, here is a suggested convention:\r\n```\r\n# with overline, for parts\r\n* with overline, for chapters\r\n=, for sections\r\n-, for subsections\r\n```\r\nit looks like here the markers are indeed assigned in the order they appear: `# * - =`\r\ni.e. 
the `====` heading is demoted to level 4 and thus below `:tocdepth: 3`.\r\nI haven't found a way to force the levels to some other order if the first intended `subsection` entry just happens to occur before the first `section`...\r\nMight make more sense here to mark `Developer Documentation` as a chapter, on the same level as `Project details`.","title":"Section/headers level hierarchy mixup (?) in docs (main index in particular)","body":"I noted some oddities in appearance of `index.rst` - the `Developer Documentation` \"section\" looks somewhat stitched to the end of the `User Documentation` chapter and has no TOC entry. Although it _appears_ to be marked as section according to the Python [devguide](https://devguide.python.org/documenting/#sections) recommendations\r\n> Normally, there are no heading levels assigned to certain characters as the structure is determined from the succession of headings. However, for the Python documentation, here is a suggested convention:\r\n```\r\n# with overline, for parts\r\n* with overline, for chapters\r\n=, for sections\r\n-, for subsections\r\n```\r\nit looks like here the markers are indeed assigned in the order they appear: `# * - =`\r\ni.e. the `====` heading is demoted to level 4 and thus below `:tocdepth: 3`.\r\nI haven't found a way to force the levels to some other order if the first intended `subsection` entry just happens to occur before the first `section`...\r\nMight make more sense here to mark `Developer Documentation` as a chapter, on the same level as `Project details`.","html":"

Section/headers level hierarchy mixup (?) in docs (main index in particular)

\n\n

I noted some oddities in appearance of index.rst - the Developer Documentation \"section\" looks somewhat stitched to the end of the User Documentation chapter and has no TOC entry. Although it appears to be marked as section according to the Python devguide recommendations

\n\n
\n

Normally, there are no heading levels assigned to certain characters as the structure is determined from the succession of headings. However, for the Python documentation, here is a suggested convention:\n ```

\n \n

with overline, for parts

\n
\n\n
    \n
  • with overline, for chapters\n=, for sections\n-, for subsections\n``\nit looks like here the markers are indeed assigned in the order they appear:# * - =\ni.e. the====heading is demoted to level 4 and thus below:tocdepth: 3.\nI haven't found a way to force the levels to some other order if the first intendedsubsectionentry just happens to occur before the firstsection...\nMight make more sense here to markDeveloper Documentationas a chapter, on the same level asProject details`.
  • \n
\n","meta":{"source":"GitHub","url":"https://github.com/astropy/astropy/issues/9112"},"_input_hash":-1535813697,"_task_hash":-1709665050,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# DATABENDERS - MIND PALACE \n\n**Before you start, please follow this format for your issue title**: \r\nTEAM NAME - PROJECT NAME\r\n\r\n## \u2139\ufe0f Project information\r\n_Please complete all applicable._\r\n\r\n- **Project Name**: MIND PALACE\r\n- **Short Project Description**: ALL IN ONE TOOLKIT FOR PEOPLE SUFFERING FROM MENTAL HEALTH ISSUES\r\n- **Team Name**: DATABENDERS\r\n- **Team Members**: PARTH SHARMA (https://github.com/Mr-Parth) , TUSHAR_ANCHLIYA (https://github.com/anchliyatushar), NANDINI RATHORE (https://github.com/nandini035)\r\n- **Demo Link**: https://drive.google.com/open?id=19wq-i6dq1EiTqT_M2Dd3zPYDcX3MA7RK\r\n- **Repository Link**: https://github.com/Mr-Parth/CFT-Hack_api\r\n- **Labels**: Blockchain\r\n\r\n## \ud83d\udd25 Your Pitch\r\nWe are developing an ALL-IN-ONE platform for people suffering from mental health issues. 
The prototype is technically divided in three parts :- Android App (UI), Backend (NodeJS), Truffle App (Ethereum | VueJs | Web3).\r\nWe bridge the gap between Therapists and Normal Users, enabling to access virtual support groups and have a sacred experience to deal with all negativity.\r\nOur Revenue Model is based on :- Advertisements, Freemium, Commission\r\nKEY POINTS IN OUR PRODUCT :- Therapists, Support Groups, Blockchain, ALL-IN-ONE kind of intents, Blogs website/community \r\n\r\n## \ud83d\udd26 Any other specific thing you want to highlight?\r\nKindly go through README :)\r\n\r\n## \u2705 Checklist\r\n\r\n**Before you post the issue**:\r\n- [\u2705 ] You have followed the issue title format.\r\n- [\u2705 ] You have mentioned the correct labels.\r\n- [\u2705 ] You have provided all the information correctly.\r\n- [\u2705 ] You have uploaded the pitch deck to the given Google Drive\r\n","title":"DATABENDERS - MIND PALACE ","body":"**Before you start, please follow this format for your issue title**: \r\nTEAM NAME - PROJECT NAME\r\n\r\n## \u2139\ufe0f Project information\r\n_Please complete all applicable._\r\n\r\n- **Project Name**: MIND PALACE\r\n- **Short Project Description**: ALL IN ONE TOOLKIT FOR PEOPLE SUFFERING FROM MENTAL HEALTH ISSUES\r\n- **Team Name**: DATABENDERS\r\n- **Team Members**: PARTH SHARMA (https://github.com/Mr-Parth) , TUSHAR_ANCHLIYA (https://github.com/anchliyatushar), NANDINI RATHORE (https://github.com/nandini035)\r\n- **Demo Link**: https://drive.google.com/open?id=19wq-i6dq1EiTqT_M2Dd3zPYDcX3MA7RK\r\n- **Repository Link**: https://github.com/Mr-Parth/CFT-Hack_api\r\n- **Labels**: Blockchain\r\n\r\n## \ud83d\udd25 Your Pitch\r\nWe are developing an ALL-IN-ONE platform for people suffering from mental health issues. 
The prototype is technically divided in three parts :- Android App (UI), Backend (NodeJS), Truffle App (Ethereum | VueJs | Web3).\r\nWe bridge the gap between Therapists and Normal Users, enabling to access virtual support groups and have a sacred experience to deal with all negativity.\r\nOur Revenue Model is based on :- Advertisements, Freemium, Commission\r\nKEY POINTS IN OUR PRODUCT :- Therapists, Support Groups, Blockchain, ALL-IN-ONE kind of intents, Blogs website/community \r\n\r\n## \ud83d\udd26 Any other specific thing you want to highlight?\r\nKindly go through README :)\r\n\r\n## \u2705 Checklist\r\n\r\n**Before you post the issue**:\r\n- [\u2705 ] You have followed the issue title format.\r\n- [\u2705 ] You have mentioned the correct labels.\r\n- [\u2705 ] You have provided all the information correctly.\r\n- [\u2705 ] You have uploaded the pitch deck to the given Google Drive\r\n","html":"

DATABENDERS - MIND PALACE

\n\n

Before you start, please follow this format for your issue title:
\nTEAM NAME - PROJECT NAME

\n\n

\u2139\ufe0f Project information

\n\n

Please complete all applicable.

\n\n
    \n
  • Project Name: MIND PALACE
  • \n
  • Short Project Description: ALL IN ONE TOOLKIT FOR PEOPLE SUFFERING FROM MENTAL HEALTH ISSUES
  • \n
  • Team Name: DATABENDERS
  • \n
  • Team Members: PARTH SHARMA (https://github.com/Mr-Parth) , TUSHAR_ANCHLIYA (https://github.com/anchliyatushar), NANDINI RATHORE (https://github.com/nandini035)
  • \n
  • Demo Link: https://drive.google.com/open?id=19wq-i6dq1EiTqT_M2Dd3zPYDcX3MA7RK
  • \n
  • Repository Link: https://github.com/Mr-Parth/CFT-Hack_api
  • \n
  • Labels: Blockchain
  • \n
\n\n

\ud83d\udd25 Your Pitch

\n\n

We are developing an ALL-IN-ONE platform for people suffering from mental health issues. The prototype is technically divided in three parts :- Android App (UI), Backend (NodeJS), Truffle App (Ethereum | VueJs | Web3).\nWe bridge the gap between Therapists and Normal Users, enabling to access virtual support groups and have a sacred experience to deal with all negativity.\nOur Revenue Model is based on :- Advertisements, Freemium, Commission\nKEY POINTS IN OUR PRODUCT :- Therapists, Support Groups, Blockchain, ALL-IN-ONE kind of intents, Blogs website/community

\n\n

\ud83d\udd26 Any other specific thing you want to highlight?

\n\n

Kindly go through README :)

\n\n

\u2705 Checklist

\n\n

Before you post the issue:\n- [\u2705 ] You have followed the issue title format.\n- [\u2705 ] You have mentioned the correct labels.\n- [\u2705 ] You have provided all the information correctly.\n- [\u2705 ] You have uploaded the pitch deck to the given Google Drive

\n","meta":{"source":"GitHub","url":"https://github.com/cft-hacks/submissions/issues/11"},"_input_hash":1529620750,"_task_hash":-667860000,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# The system cannot find the path specified. error Command failed with exit code 1.\n\n$ yarn bootstrap yarn run v1.13.0 $ yarn && ocular-bootstrap warning package-lock.json found. Your project contains lock files generated by tools other than Yarn. It is advised not to mix package managers in order to avoid resolution inconsistencies cau sed by unsynchronized lock files. To clear this warning, remove package-lock.json. [1/5] Validating package.json... [2/5] Resolving packages... [3/5] Fetching packages... info fsevents@1.2.9: The platform \"win32\" is incompatible with this module. info \"fsevents@1.2.9\" is an optional dependency and failed compatibility check. Excluding it from installation. [4/5] Linking dependencies... warning \" > @babel/plugin-proposal-class-properties@7.4.4\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/plugin-proposal-class-properties > @babel/helper-create-class-features-plugin@7.4.4\" has unmet peer dependency \"@babel/core@^7.0.0\". warning \" > @babel/preset-react@7.0.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/preset-react > @babel/plugin-transform-react-display-name@7.2.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/preset-react > @babel/plugin-transform-react-jsx@7.3.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/preset-react > @babel/plugin-transform-react-jsx-self@7.2.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/preset-react > @babel/plugin-transform-react-jsx-source@7.2.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/preset-react > @babel/plugin-transform-react-jsx > @babel/plugin-syntax-jsx@7.2.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". 
warning \" > @babel/register@7.4.4\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \" > @deck.gl/test-utils@7.1.10\" has unmet peer dependency \"@deck.gl/core@^7.0.0\". warning \" > babel-loader@8.0.5\" has unmet peer dependency \"@babel/core@^7.0.0\". warning \" > babel-loader@8.0.5\" has unmet peer dependency \"webpack@>=2\". warning \" > eslint-config-uber-jsx@3.3.3\" has unmet peer dependency \"eslint@>= 3.0.0 < 5\". warning \"eslint-config-uber-jsx > eslint-config-uber-es5@2.0.3\" has unmet peer dependency \"eslint@>= 3.0.0 < 5\". warning \"eslint-config-uber-jsx > eslint-plugin-react@6.10.3\" has unmet peer dependency \"eslint@^2.0.0 \\|\\| ^3.0.0\". warning \" > eslint-plugin-react@7.12.4\" has unmet peer dependency \"eslint@^3.0.0 \\|\\| ^4.0.0 \\|\\| ^5.0.0\". warning \" > ocular-dev-tools@0.0.27\" has incorrect peer dependency \"@probe.gl/test-utils@^3.0.2\". warning \" > @streetscape.gl/layers@1.0.0-beta.16\" has unmet peer dependency \"@deck.gl/core@^7.1.2\". warning \" > @streetscape.gl/layers@1.0.0-beta.16\" has unmet peer dependency \"@deck.gl/layers@^7.1.2\". [5/5] Building fresh packages... The system cannot find the path specified. error Command failed with exit code 1. info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.\r\n--\r\n\r\n\r\nWould you help me to figure it out the issue.","title":"The system cannot find the path specified. error Command failed with exit code 1.","body":"$ yarn bootstrap yarn run v1.13.0 $ yarn && ocular-bootstrap warning package-lock.json found. Your project contains lock files generated by tools other than Yarn. It is advised not to mix package managers in order to avoid resolution inconsistencies cau sed by unsynchronized lock files. To clear this warning, remove package-lock.json. [1/5] Validating package.json... [2/5] Resolving packages... [3/5] Fetching packages... info fsevents@1.2.9: The platform \"win32\" is incompatible with this module. 
info \"fsevents@1.2.9\" is an optional dependency and failed compatibility check. Excluding it from installation. [4/5] Linking dependencies... warning \" > @babel/plugin-proposal-class-properties@7.4.4\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/plugin-proposal-class-properties > @babel/helper-create-class-features-plugin@7.4.4\" has unmet peer dependency \"@babel/core@^7.0.0\". warning \" > @babel/preset-react@7.0.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/preset-react > @babel/plugin-transform-react-display-name@7.2.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/preset-react > @babel/plugin-transform-react-jsx@7.3.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/preset-react > @babel/plugin-transform-react-jsx-self@7.2.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/preset-react > @babel/plugin-transform-react-jsx-source@7.2.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/preset-react > @babel/plugin-transform-react-jsx > @babel/plugin-syntax-jsx@7.2.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \" > @babel/register@7.4.4\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \" > @deck.gl/test-utils@7.1.10\" has unmet peer dependency \"@deck.gl/core@^7.0.0\". warning \" > babel-loader@8.0.5\" has unmet peer dependency \"@babel/core@^7.0.0\". warning \" > babel-loader@8.0.5\" has unmet peer dependency \"webpack@>=2\". warning \" > eslint-config-uber-jsx@3.3.3\" has unmet peer dependency \"eslint@>= 3.0.0 < 5\". warning \"eslint-config-uber-jsx > eslint-config-uber-es5@2.0.3\" has unmet peer dependency \"eslint@>= 3.0.0 < 5\". warning \"eslint-config-uber-jsx > eslint-plugin-react@6.10.3\" has unmet peer dependency \"eslint@^2.0.0 \\|\\| ^3.0.0\". warning \" > eslint-plugin-react@7.12.4\" has unmet peer dependency \"eslint@^3.0.0 \\|\\| ^4.0.0 \\|\\| ^5.0.0\". 
warning \" > ocular-dev-tools@0.0.27\" has incorrect peer dependency \"@probe.gl/test-utils@^3.0.2\". warning \" > @streetscape.gl/layers@1.0.0-beta.16\" has unmet peer dependency \"@deck.gl/core@^7.1.2\". warning \" > @streetscape.gl/layers@1.0.0-beta.16\" has unmet peer dependency \"@deck.gl/layers@^7.1.2\". [5/5] Building fresh packages... The system cannot find the path specified. error Command failed with exit code 1. info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.\r\n--\r\n\r\n\r\nWould you help me to figure it out the issue.","html":"

The system cannot find the path specified. error Command failed with exit code 1.

\n\n

$ yarn bootstrap yarn run v1.13.0 $ yarn && ocular-bootstrap warning package-lock.json found. Your project contains lock files generated by tools other than Yarn. It is advised not to mix package managers in order to avoid resolution inconsistencies cau sed by unsynchronized lock files. To clear this warning, remove package-lock.json. [1/5] Validating package.json... [2/5] Resolving packages... [3/5] Fetching packages... info fsevents@1.2.9: The platform \"win32\" is incompatible with this module. info \"fsevents@1.2.9\" is an optional dependency and failed compatibility check. Excluding it from installation. [4/5] Linking dependencies... warning \" > @babel/plugin-proposal-class-properties@7.4.4\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/plugin-proposal-class-properties > @babel/helper-create-class-features-plugin@7.4.4\" has unmet peer dependency \"@babel/core@^7.0.0\". warning \" > @babel/preset-react@7.0.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/preset-react > @babel/plugin-transform-react-display-name@7.2.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/preset-react > @babel/plugin-transform-react-jsx@7.3.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/preset-react > @babel/plugin-transform-react-jsx-self@7.2.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/preset-react > @babel/plugin-transform-react-jsx-source@7.2.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \"@babel/preset-react > @babel/plugin-transform-react-jsx > @babel/plugin-syntax-jsx@7.2.0\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \" > @babel/register@7.4.4\" has unmet peer dependency \"@babel/core@^7.0.0-0\". warning \" > @deck.gl/test-utils@7.1.10\" has unmet peer dependency \"@deck.gl/core@^7.0.0\". warning \" > babel-loader@8.0.5\" has unmet peer dependency \"@babel/core@^7.0.0\". 
warning \" > babel-loader@8.0.5\" has unmet peer dependency \"webpack@>=2\". warning \" > eslint-config-uber-jsx@3.3.3\" has unmet peer dependency \"eslint@>= 3.0.0 < 5\". warning \"eslint-config-uber-jsx > eslint-config-uber-es5@2.0.3\" has unmet peer dependency \"eslint@>= 3.0.0 < 5\". warning \"eslint-config-uber-jsx > eslint-plugin-react@6.10.3\" has unmet peer dependency \"eslint@^2.0.0 \\|\\| ^3.0.0\". warning \" > eslint-plugin-react@7.12.4\" has unmet peer dependency \"eslint@^3.0.0 \\|\\| ^4.0.0 \\|\\| ^5.0.0\". warning \" > ocular-dev-tools@0.0.27\" has incorrect peer dependency \"@probe.gl/test-utils@^3.0.2\". warning \" > @streetscape.gl/layers@1.0.0-beta.16\" has unmet peer dependency \"@deck.gl/core@^7.1.2\". warning \" > @streetscape.gl/layers@1.0.0-beta.16\" has unmet peer dependency \"@deck.gl/layers@^7.1.2\". [5/5] Building fresh packages... The system cannot find the path specified. error Command failed with exit code 1. info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.

\n\n

Would you help me to figure it out the issue.

\n","meta":{"source":"GitHub","url":"https://github.com/uber/streetscape.gl/issues/377"},"_input_hash":1194123017,"_task_hash":-1104201513,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Setup CI","meta":{"source":"GitHub","url":"https://github.com/smitthakkar96/ascii_binder_search_plugin/issues/20"},"label":"DOCUMENTATION","_input_hash":-1215509298,"_task_hash":1906545128,"answer":"reject"} {"text":"Release iOS SDK with README","meta":{"source":"GitHub","url":"https://github.com/hasura/support/issues/372"},"label":"DOCUMENTATION","_input_hash":270029648,"_task_hash":536559238,"answer":"accept"} {"text":"Improve docs","meta":{"source":"GitHub","url":"https://github.com/reactjs/react-transition-group/issues/134"},"label":"DOCUMENTATION","_input_hash":-1783060616,"_task_hash":-712768656,"answer":"accept"} {"text":"# Documentation does not show \"ref\" in method signatures\n\n\r\n\r\n\r\n\r\nThe documentation on the website doesn't show the `ref` keyword in method signatures. For example, compare the documentation pages for [this](http://www.monogame.net/documentation/?page=M_Microsoft_Xna_Framework_Input_Joystick_GetState_1) and [this](http://www.monogame.net/documentation/?page=M_Microsoft_Xna_Framework_Rectangle_Intersect) with the actual signatures [here](https://github.com/MonoGame/MonoGame/blob/develop/MonoGame.Framework/Input/Joystick.cs#L57) and [here](https://github.com/MonoGame/MonoGame/blob/develop/MonoGame.Framework/Rectangle.cs#L425).\r\n\r\n\r\n\r\n\r\n\r\n#### What version of MonoGame does the bug occur on:\r\n- Develop\r\n\r\n#### What operating system are you using:\r\n- N/A\r\n\r\n#### What MonoGame platform are you using:\r\n\r\n- N/A\r\n","title":"Documentation does not show \"ref\" in method signatures","body":"\r\n\r\n\r\n\r\nThe documentation on the website doesn't show the `ref` keyword in method signatures. 
For example, compare the documentation pages for [this](http://www.monogame.net/documentation/?page=M_Microsoft_Xna_Framework_Input_Joystick_GetState_1) and [this](http://www.monogame.net/documentation/?page=M_Microsoft_Xna_Framework_Rectangle_Intersect) with the actual signatures [here](https://github.com/MonoGame/MonoGame/blob/develop/MonoGame.Framework/Input/Joystick.cs#L57) and [here](https://github.com/MonoGame/MonoGame/blob/develop/MonoGame.Framework/Rectangle.cs#L425).\r\n\r\n\r\n\r\n\r\n\r\n#### What version of MonoGame does the bug occur on:\r\n- Develop\r\n\r\n#### What operating system are you using:\r\n- N/A\r\n\r\n#### What MonoGame platform are you using:\r\n\r\n- N/A\r\n","html":"

Documentation does not show \"ref\" in method signatures

\n\n\n\n\n\n

The documentation on the website doesn't show the ref keyword in method signatures. For example, compare the documentation pages for this and this with the actual signatures here and here.

\n\n\n\n

What version of MonoGame does the bug occur on:

\n\n
    \n
  • Develop
  • \n
\n\n

What operating system are you using:

\n\n
    \n
  • N/A
  • \n
\n\n

What MonoGame platform are you using:

\n\n

\n- N/A

\n","meta":{"source":"GitHub","url":"https://github.com/MonoGame/MonoGame/issues/6849"},"_input_hash":-1657210920,"_task_hash":-1256719670,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"# out of date function signatures\n\nthe \"WebGLRenderingContext\" parameter is missing from the following docs:\r\n\r\nhttps://github.com/greggman/twgl.js/blob/a90ce8ef2083fd6185fa30cc8972aafc28cc68d7/dist/4.x/twgl.d.ts#L251\r\nhttps://github.com/greggman/twgl.js/blob/a90ce8ef2083fd6185fa30cc8972aafc28cc68d7/dist/4.x/twgl.d.ts#L267\r\nhttps://github.com/greggman/twgl.js/blob/a90ce8ef2083fd6185fa30cc8972aafc28cc68d7/src/programs.js#L828-L838\r\nhttps://github.com/greggman/twgl.js/blob/a90ce8ef2083fd6185fa30cc8972aafc28cc68d7/src/programs.js#L1393-L1402\r\n","title":"out of date function signatures","body":"the \"WebGLRenderingContext\" parameter is missing from the following docs:\r\n\r\nhttps://github.com/greggman/twgl.js/blob/a90ce8ef2083fd6185fa30cc8972aafc28cc68d7/dist/4.x/twgl.d.ts#L251\r\nhttps://github.com/greggman/twgl.js/blob/a90ce8ef2083fd6185fa30cc8972aafc28cc68d7/dist/4.x/twgl.d.ts#L267\r\nhttps://github.com/greggman/twgl.js/blob/a90ce8ef2083fd6185fa30cc8972aafc28cc68d7/src/programs.js#L828-L838\r\nhttps://github.com/greggman/twgl.js/blob/a90ce8ef2083fd6185fa30cc8972aafc28cc68d7/src/programs.js#L1393-L1402\r\n","html":"

out of date function signatures

\n\n

the \"WebGLRenderingContext\" parameter is missing from the following docs:

\n\n

https://github.com/greggman/twgl.js/blob/a90ce8ef2083fd6185fa30cc8972aafc28cc68d7/dist/4.x/twgl.d.ts#L251\nhttps://github.com/greggman/twgl.js/blob/a90ce8ef2083fd6185fa30cc8972aafc28cc68d7/dist/4.x/twgl.d.ts#L267\nhttps://github.com/greggman/twgl.js/blob/a90ce8ef2083fd6185fa30cc8972aafc28cc68d7/src/programs.js#L828-L838\nhttps://github.com/greggman/twgl.js/blob/a90ce8ef2083fd6185fa30cc8972aafc28cc68d7/src/programs.js#L1393-L1402

\n","meta":{"source":"GitHub","url":"https://github.com/greggman/twgl.js/issues/135"},"_input_hash":-506246362,"_task_hash":1201122890,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"No VHDL nor Verilog standard version specified in readme","meta":{"source":"GitHub","url":"https://github.com/Nic30/hdlConvertor/issues/6"},"label":"DOCUMENTATION","_input_hash":1724073475,"_task_hash":-1967128991,"answer":"accept"} {"text":"Enhance the `options` section in the DOCs","meta":{"source":"GitHub","url":"https://github.com/conan-io/docs/issues/266"},"label":"DOCUMENTATION","_input_hash":1090791044,"_task_hash":-638713962,"answer":"accept"} {"text":"cryptography 2.0.1 segfaults on Ubuntu 12.04","meta":{"source":"GitHub","url":"https://github.com/pyca/cryptography/issues/3824"},"label":"DOCUMENTATION","_input_hash":-169112204,"_task_hash":-1534260817,"answer":"reject"} {"text":"Document math.Log template func","meta":{"source":"GitHub","url":"https://github.com/gohugoio/hugoDocs/issues/99"},"label":"DOCUMENTATION","_input_hash":-443648706,"_task_hash":307483138,"answer":"accept"} {"text":"ResourceList does not render item as per doc","meta":{"source":"GitHub","url":"https://github.com/Shopify/polaris/issues/152"},"label":"DOCUMENTATION","_input_hash":-1018760299,"_task_hash":-478938777,"answer":"reject"} {"text":"# Custom Operations with Docker\n\nI try to add [custom operations](https://docs.nvidia.com/deeplearning/sdk/tensorrt-inference-server-master-branch-guide/docs/custom_operation.html#custom-operations) to a model\r\n\r\nCurrently I run the Server with [docker](https://docs.nvidia.com/deeplearning/sdk/tensorrt-inference-server-master-branch-guide/docs/run.html#running-the-inference-server):\r\n\r\n```\r\n$ nvidia-docker run --rm --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 -p8000:8000 -p8001:8001 -p8002:8002 -v/path/to/model/repository:/models trtserver --model-store=/models\r\n```\r\nFrom the documentation about custom operations I 
infer that the combined command looks like:\r\n\r\n```\r\n`$ nvidia-docker run --rm --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 -p8000:8000 -p8001:8001 -p8002:8002 -v/path/to/model/repository:/models LD_PRELOAD=libtrtcustom.so trtserver --model-store=/models\r\n```\r\ndifference: LD_PRELOAD=libtrtcustom.so added before trtserver\r\n\r\nThe problem is that the server does not start - it does not find the library.\r\nWhere do I have to place the library (in container or outside)? \r\nDoes the approach work with docker or only with manual builds?\r\nHow does the correct command to run the server look like?\r\n\r\n","title":"Custom Operations with Docker","body":"I try to add [custom operations](https://docs.nvidia.com/deeplearning/sdk/tensorrt-inference-server-master-branch-guide/docs/custom_operation.html#custom-operations) to a model\r\n\r\nCurrently I run the Server with [docker](https://docs.nvidia.com/deeplearning/sdk/tensorrt-inference-server-master-branch-guide/docs/run.html#running-the-inference-server):\r\n\r\n```\r\n$ nvidia-docker run --rm --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 -p8000:8000 -p8001:8001 -p8002:8002 -v/path/to/model/repository:/models trtserver --model-store=/models\r\n```\r\nFrom the documentation about custom operations I infer that the combined command looks like:\r\n\r\n```\r\n`$ nvidia-docker run --rm --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 -p8000:8000 -p8001:8001 -p8002:8002 -v/path/to/model/repository:/models LD_PRELOAD=libtrtcustom.so trtserver --model-store=/models\r\n```\r\ndifference: LD_PRELOAD=libtrtcustom.so added before trtserver\r\n\r\nThe problem is that the server does not start - it does not find the library.\r\nWhere do I have to place the library (in container or outside)? \r\nDoes the approach work with docker or only with manual builds?\r\nHow does the correct command to run the server look like?\r\n\r\n","html":"

Custom Operations with Docker

\n\n

I try to add custom operations to a model

\n\n

Currently I run the Server with docker:

\n\n

\n$ nvidia-docker run --rm --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 -p8000:8000 -p8001:8001 -p8002:8002 -v/path/to/model/repository:/models <tensorrtserver image name> trtserver --model-store=/models\n\nFrom the documentation about custom operations I infer that the combined command looks like:

\n\n

\n`$ nvidia-docker run --rm --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 -p8000:8000 -p8001:8001 -p8002:8002 -v/path/to/model/repository:/models <tensorrtserver image name> LD_PRELOAD=libtrtcustom.so trtserver --model-store=/models\n\ndifference: LD_PRELOAD=libtrtcustom.so added before trtserver

\n\n

The problem is that the server does not start - it does not find the library.\nWhere do I have to place the library (in container or outside)? \nDoes the approach work with docker or only with manual builds?\nHow does the correct command to run the server look like?

\n","meta":{"source":"GitHub","url":"https://github.com/NVIDIA/tensorrt-inference-server/issues/545"},"_input_hash":135059905,"_task_hash":-1247856079,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"# Docs: Add documentation for style overriding for each Field component\n\n","title":"Docs: Add documentation for style overriding for each Field component","body":"","html":"

Docs: Add documentation for style overriding for each Field component

\n","meta":{"source":"GitHub","url":"https://github.com/Sam-Ogden/react-formtype/issues/17"},"_input_hash":-371885405,"_task_hash":2082979117,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"[PRE REVIEW]: Django Remote Submission","meta":{"source":"GitHub","url":"https://github.com/openjournals/joss-reviews/issues/336"},"label":"DOCUMENTATION","_input_hash":754724624,"_task_hash":-1921166607,"answer":"reject"} {"text":"Question","meta":{"source":"GitHub","url":"https://github.com/thunlp/TensorFlow-NRE/issues/24"},"label":"DOCUMENTATION","_input_hash":-515359321,"_task_hash":-302234087,"answer":"reject"} {"text":"# Default title\n\ndefault description\n\n|Property | Value|\n|------------ | -------------|\n| Session ID | 6a50b5012de09d92a86454b06eb92fda3439729d |\n| Status | done |\n| Reason | CLIENT_STOPPED_SESSION |\n| Input Capabilities |
  • **build:** Automate_win_chrome545380
  • **name:** win_chrome_112100
  • **browserstack.queue.retries:** 2
  • **acceptSslCert:** false
  • **detected_language:** selenium/3.141.0 (ruby linux)
  • **browserstack.seleniumLogs:** true
  • **browserstack.appiumLogs:** false
  • **browser_version:** 76.0
|\n| Session URL | https://automate-ci.bsstag.com/builds/2f6273d21fc8803236b36339a52ad5d189b05938/sessions/6a50b5012de09d92a86454b06eb92fda3439729d |\n| Public Session URL | https://automate-ci.bsstag.com/builds/2f6273d21fc8803236b36339a52ad5d189b05938/sessions/6a50b5012de09d92a86454b06eb92fda3439729d?auth_token=5c417f82f12253b97d869886be9282bba500d98c4fc6cd7dd1887312f38e16d4 |\n| Exception Timestamp | 00:04 |\n\n\n**Exception Stacktrace: **no such element: Unable to locate element: {\"method\":\"id\",\"selector\":\"okgoogle\"}\n (Session info: chrome=76.0.3809.87)\n (Driver info: chromedriver=76.0.3809.68 (420c9498db8ce8fcd190a954d51297672c1515d5-refs/branch-heads/3809@{#864}),platform=Mac OS X 10.11.6 x86_64) (WARNING: The server did not provide any stacktrace information)\nCommand duration or timeout: 20 milliseconds\nFor documentation on this error, please visit: http://seleniumhq.org/exceptions/no_such_element.html\nBuild info: version: '2.53.0', revision: '35ae25b', time: '2016-03-15 17:00:58'\nSystem info: host: 'mac-208-52-157-48.browserstack.com', ip: '208.52.157.48', os.name: 'Mac OS X', os.arch: 'x86_64', os.version: '10.11.6', java.version: '1.8.0_51'\nDriver info: org.openqa.selenium.chrome.ChromeDriver\nCapabilities [{mobileEmulationEnabled=false, timeouts={implicit=0, pageLoad=300000, script=30000}, hasTouchScreen=false, platform=MAC, acceptSslCerts=false, goog:chromeOptions={debuggerAddress=localhost:55078}, acceptInsecureCerts=false, webStorageEnabled=true, browserName=chrome, takesScreenshot=true, javascriptEnabled=true, setWindowRect=true, unexpectedAlertBehaviour=ignore, applicationCacheEnabled=false, rotatable=false, networkConnectionEnabled=false, chrome={chromedriverVersion=76.0.3809.68 (420c9498db8ce8fcd190a954d51297672c1515d5-refs/branch-heads/3809@{#864}), userDataDir=/var/folders/3y/zz_w6_s56sl__vcrf3r5bhzr0000hr/T/.com.google.Chrome.0Fpv4v}, takesHeapSnapshot=true, pageLoadStrategy=normal, strictFileInteractability=false, 
databaseEnabled=false, handlesAlerts=true, version=76.0.3809.87, browserConnectionEnabled=false, proxy={}, nativeEvents=true, locationContextEnabled=true, cssSelectorsEnabled=true}]\nSession ID: d829b31bca8e01fa091db8d5d749aaab\n*** Element info: {Using=id, value=okgoogle}","title":"Default title","body":"default description\n\n|Property | Value|\n|------------ | -------------|\n| Session ID | 6a50b5012de09d92a86454b06eb92fda3439729d |\n| Status | done |\n| Reason | CLIENT_STOPPED_SESSION |\n| Input Capabilities |
  • **build:** Automate_win_chrome545380
  • **name:** win_chrome_112100
  • **browserstack.queue.retries:** 2
  • **acceptSslCert:** false
  • **detected_language:** selenium/3.141.0 (ruby linux)
  • **browserstack.seleniumLogs:** true
  • **browserstack.appiumLogs:** false
  • **browser_version:** 76.0
|\n| Session URL | https://automate-ci.bsstag.com/builds/2f6273d21fc8803236b36339a52ad5d189b05938/sessions/6a50b5012de09d92a86454b06eb92fda3439729d |\n| Public Session URL | https://automate-ci.bsstag.com/builds/2f6273d21fc8803236b36339a52ad5d189b05938/sessions/6a50b5012de09d92a86454b06eb92fda3439729d?auth_token=5c417f82f12253b97d869886be9282bba500d98c4fc6cd7dd1887312f38e16d4 |\n| Exception Timestamp | 00:04 |\n\n\n**Exception Stacktrace: **no such element: Unable to locate element: {\"method\":\"id\",\"selector\":\"okgoogle\"}\n (Session info: chrome=76.0.3809.87)\n (Driver info: chromedriver=76.0.3809.68 (420c9498db8ce8fcd190a954d51297672c1515d5-refs/branch-heads/3809@{#864}),platform=Mac OS X 10.11.6 x86_64) (WARNING: The server did not provide any stacktrace information)\nCommand duration or timeout: 20 milliseconds\nFor documentation on this error, please visit: http://seleniumhq.org/exceptions/no_such_element.html\nBuild info: version: '2.53.0', revision: '35ae25b', time: '2016-03-15 17:00:58'\nSystem info: host: 'mac-208-52-157-48.browserstack.com', ip: '208.52.157.48', os.name: 'Mac OS X', os.arch: 'x86_64', os.version: '10.11.6', java.version: '1.8.0_51'\nDriver info: org.openqa.selenium.chrome.ChromeDriver\nCapabilities [{mobileEmulationEnabled=false, timeouts={implicit=0, pageLoad=300000, script=30000}, hasTouchScreen=false, platform=MAC, acceptSslCerts=false, goog:chromeOptions={debuggerAddress=localhost:55078}, acceptInsecureCerts=false, webStorageEnabled=true, browserName=chrome, takesScreenshot=true, javascriptEnabled=true, setWindowRect=true, unexpectedAlertBehaviour=ignore, applicationCacheEnabled=false, rotatable=false, networkConnectionEnabled=false, chrome={chromedriverVersion=76.0.3809.68 (420c9498db8ce8fcd190a954d51297672c1515d5-refs/branch-heads/3809@{#864}), userDataDir=/var/folders/3y/zz_w6_s56sl__vcrf3r5bhzr0000hr/T/.com.google.Chrome.0Fpv4v}, takesHeapSnapshot=true, pageLoadStrategy=normal, strictFileInteractability=false, 
databaseEnabled=false, handlesAlerts=true, version=76.0.3809.87, browserConnectionEnabled=false, proxy={}, nativeEvents=true, locationContextEnabled=true, cssSelectorsEnabled=true}]\nSession ID: d829b31bca8e01fa091db8d5d749aaab\n*** Element info: {Using=id, value=okgoogle}","html":"

Default title

\n\n

default description

\n\n

|Property | Value|\n|------------ | -------------|\n| Session ID | 6a50b5012de09d92a86454b06eb92fda3439729d |\n| Status | done |\n| Reason | CLIENTSTOPPEDSESSION |\n| Input Capabilities |

  • build: Automatewinchrome545380
  • name: winchrome112100
  • browserstack.queue.retries: 2
  • acceptSslCert: false
  • detected_language: selenium/3.141.0 (ruby linux)
  • browserstack.seleniumLogs: true
  • browserstack.appiumLogs: false
  • browser_version: 76.0
|\n| Session URL | https://automate-ci.bsstag.com/builds/2f6273d21fc8803236b36339a52ad5d189b05938/sessions/6a50b5012de09d92a86454b06eb92fda3439729d |\n| Public Session URL | https://automate-ci.bsstag.com/builds/2f6273d21fc8803236b36339a52ad5d189b05938/sessions/6a50b5012de09d92a86454b06eb92fda3439729d?auth_token=5c417f82f12253b97d869886be9282bba500d98c4fc6cd7dd1887312f38e16d4 |\n| Exception Timestamp | 00:04 |

\n\n

Exception Stacktrace: no such element: Unable to locate element: {\"method\":\"id\",\"selector\":\"okgoogle\"}\n (Session info: chrome=76.0.3809.87)\n (Driver info: chromedriver=76.0.3809.68 (420c9498db8ce8fcd190a954d51297672c1515d5-refs/branch-heads/3809@{#864}),platform=Mac OS X 10.11.6 x86_64) (WARNING: The server did not provide any stacktrace information)\nCommand duration or timeout: 20 milliseconds\nFor documentation on this error, please visit: http://seleniumhq.org/exceptions/no_such_element.html\nBuild info: version: '2.53.0', revision: '35ae25b', time: '2016-03-15 17:00:58'\nSystem info: host: 'mac-208-52-157-48.browserstack.com', ip: '208.52.157.48', os.name: 'Mac OS X', os.arch: 'x86_64', os.version: '10.11.6', java.version: '1.8.0_51'\nDriver info: org.openqa.selenium.chrome.ChromeDriver\nCapabilities [{mobileEmulationEnabled=false, timeouts={implicit=0, pageLoad=300000, script=30000}, hasTouchScreen=false, platform=MAC, acceptSslCerts=false, goog:chromeOptions={debuggerAddress=localhost:55078}, acceptInsecureCerts=false, webStorageEnabled=true, browserName=chrome, takesScreenshot=true, javascriptEnabled=true, setWindowRect=true, unexpectedAlertBehaviour=ignore, applicationCacheEnabled=false, rotatable=false, networkConnectionEnabled=false, chrome={chromedriverVersion=76.0.3809.68 (420c9498db8ce8fcd190a954d51297672c1515d5-refs/branch-heads/3809@{#864}), userDataDir=/var/folders/3y/zz_w6_s56sl__vcrf3r5bhzr0000hr/T/.com.google.Chrome.0Fpv4v}, takesHeapSnapshot=true, pageLoadStrategy=normal, strictFileInteractability=false, databaseEnabled=false, handlesAlerts=true, version=76.0.3809.87, browserConnectionEnabled=false, proxy={}, nativeEvents=true, locationContextEnabled=true, cssSelectorsEnabled=true}]\nSession ID: d829b31bca8e01fa091db8d5d749aaab\n*** Element info: {Using=id, value=okgoogle}

\n","meta":{"source":"GitHub","url":"https://github.com/automationbs/testbugreporting/issues/366"},"_input_hash":-1350293656,"_task_hash":-451382904,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"date_added and shelves properties on getSingleShelf results?","meta":{"source":"GitHub","url":"https://github.com/bdickason/node-goodreads/issues/35"},"label":"DOCUMENTATION","_input_hash":-875242303,"_task_hash":986093914,"answer":"reject"} {"text":"Can you please provide more information about users file?","meta":{"source":"GitHub","url":"https://github.com/nrwiersma/docker-afpd/issues/1"},"label":"DOCUMENTATION","_input_hash":-405792887,"_task_hash":18270317,"answer":"accept"} {"text":"Missing node-gyp in manual install instructions","meta":{"source":"GitHub","url":"https://github.com/michaelgrosner/tribeca/issues/157"},"label":"DOCUMENTATION","_input_hash":-1836475642,"_task_hash":-139006186,"answer":"accept"} {"text":"Setting timeout option in Start-AzureRmVm","meta":{"source":"GitHub","url":"https://github.com/Azure/azure-powershell/issues/4374"},"label":"DOCUMENTATION","_input_hash":-444699104,"_task_hash":2052970916,"answer":"reject"} {"text":"# Documentation is lacking\n\n# Bug Report\r\n\r\n## System Information\r\n- Ubuntu 19.04\r\n- v1.8.3.4 built 09/08/2019 13:01\r\n- Mainline\r\n## I confirm:\r\n- [x ] that I have searched for an existing bug report for this issue.\r\n- [ x] that I am using the latest available version of AMP.\r\n- [ x] that my operating system is up-to-date.\r\n\r\n\r\n## Symptoms \r\n\r\nI am trying to attach an existing ADS minecraft instance to a ADS on another server. Based on what I read in the forums (!) this should be possible. I found there appears to be an attach command for ampinstmgr. This command does not help to attach an instance on another server. This is documented nowhere other than in the forums.\r\n\r\nThere is many more examples of this. \r\n\r\nThe only existing documentation is on the git wiki. 
Especially the ampinstmgr command line page seems usefull but even that lacks basic explanation of options and parameters.\r\n\r\nThere needs to be better documentation!\r\n\r\n## Reproduction\r\n\r\n- Try to find information on running a game on another server: nothing in the documents\r\n- Try to find what is meant with e.g. [Module] [Provision Settings] -> Nowhere to be found \r\n- ampinstmgr help attach did provide some information (basically that I can't use it)","title":"Documentation is lacking","body":"# Bug Report\r\n\r\n## System Information\r\n- Ubuntu 19.04\r\n- v1.8.3.4 built 09/08/2019 13:01\r\n- Mainline\r\n## I confirm:\r\n- [x ] that I have searched for an existing bug report for this issue.\r\n- [ x] that I am using the latest available version of AMP.\r\n- [ x] that my operating system is up-to-date.\r\n\r\n\r\n## Symptoms \r\n\r\nI am trying to attach an existing ADS minecraft instance to a ADS on another server. Based on what I read in the forums (!) this should be possible. I found there appears to be an attach command for ampinstmgr. This command does not help to attach an instance on another server. This is documented nowhere other than in the forums.\r\n\r\nThere is many more examples of this. \r\n\r\nThe only existing documentation is on the git wiki. Especially the ampinstmgr command line page seems usefull but even that lacks basic explanation of options and parameters.\r\n\r\nThere needs to be better documentation!\r\n\r\n## Reproduction\r\n\r\n- Try to find information on running a game on another server: nothing in the documents\r\n- Try to find what is meant with e.g. [Module] [Provision Settings] -> Nowhere to be found \r\n- ampinstmgr help attach did provide some information (basically that I can't use it)","html":"

Documentation is lacking

\n\n

Bug Report

\n\n

System Information

\n\n
    \n
  • Ubuntu 19.04
  • \n
  • v1.8.3.4 built 09/08/2019 13:01
  • \n
  • Mainline

    \n\n

    I confirm:

  • \n
  • [x ] that I have searched for an existing bug report for this issue.

  • \n
  • [ x] that I am using the latest available version of AMP.
  • \n
  • [ x] that my operating system is up-to-date.\n
  • \n
\n\n

Symptoms

\n\n

I am trying to attach an existing ADS Minecraft instance to an ADS on another server. Based on what I read in the forums (!) this should be possible. I found there appears to be an attach command for ampinstmgr. This command does not help to attach an instance on another server. This is documented nowhere other than in the forums.

\n\n

There are many more examples of this.

\n\n

The only existing documentation is on the git wiki. Especially the ampinstmgr command line page seems useful, but even that lacks basic explanation of options and parameters.

\n\n

There needs to be better documentation!

\n\n

Reproduction

\n\n
    \n
  • Try to find information on running a game on another server: nothing in the documents
  • \n
  • Try to find what is meant with e.g. [Module] [Provision Settings] -> Nowhere to be found
  • \n
  • ampinstmgr help attach did provide some information (basically that I can't use it)
  • \n
\n","meta":{"source":"GitHub","url":"https://github.com/CubeCoders/AMP/issues/146"},"_input_hash":2061049457,"_task_hash":1174428455,"_view_id":"choice","answer":"accept","label":"DOCUMENTATION"} {"text":"# Swagger documentation of Attribute API is out of sync\n\nIn the documentation for endpoint [/attributes/get_attributes__attribute_id_](https://backendapi.turing.com/docs/#/attributes/get_attributes__attribute_id_) the return is an array of attributes, but when the \"Try out\" is executed only one attribute is returned (no array).","title":"Swagger documentation of Attribute API is out of sync","body":"In the documentation for endpoint [/attributes/get_attributes__attribute_id_](https://backendapi.turing.com/docs/#/attributes/get_attributes__attribute_id_) the return is an array of attributes, but when the \"Try out\" is executed only one attribute is returned (no array).","html":"

Swagger documentation of Attribute API is out of sync

\n\n

In the documentation for endpoint /attributes/get_attributes__attribute_id_ the return is an array of attributes, but when the \"Try out\" is executed only one attribute is returned (no array).

\n","meta":{"source":"GitHub","url":"https://github.com/zandoan/turing-fullstack/issues/7"},"_input_hash":1387789539,"_task_hash":-1645805562,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"npm start gives error","meta":{"source":"GitHub","url":"https://github.com/asadm/urduscript/issues/6"},"label":"DOCUMENTATION","_input_hash":-997553523,"_task_hash":-817945172,"answer":"reject"} {"text":"Remove = from command line options in docs","meta":{"source":"GitHub","url":"https://github.com/BD2KGenomics/toil/issues/1777"},"label":"DOCUMENTATION","_input_hash":-1994483691,"_task_hash":912907018,"answer":"accept"} {"text":"provision Portland K8s cluster","meta":{"source":"GitHub","url":"https://github.com/mozmeao/infra/issues/366"},"label":"DOCUMENTATION","_input_hash":1325331475,"_task_hash":874847273,"answer":"reject"} {"text":"# What is the ultimate end effect of setting bare-metal in the configuration\n\n\r\n\r\n### Description\r\n\r\n\r\nMbed-OS v5.12\r\n\r\nI'm using a custom CMake build process because I have requirements that mbed-cli is unable to accommodate. \r\n\r\nI want to set up a rtos-less build. I've seen issues #7800 and #7794. I have no problems with eliminating the files those two issues cover, but I've also seen a reference to bare-metal in https://os.mbed.com/docs/mbed-os/v5.12/tutorials/migrating-to-mbed-os-5.html\r\n\r\nI would like to know what effect that configuration has - what defines are added or whatnot? I tried grepping for bare-metal in the mbed-cli and tools directory but came up empty. 
Is there a central source for the mapping of the directives in the json files to defines/files/etc that are added to the build process?\r\n\r\n### Issue request type\r\n\r\n\r\n [X] Question\r\n [ ] Enhancement\r\n [ ] Bug\r\n\r\n","title":"What is the ultimate end effect of setting bare-metal in the configuration","body":"\r\n\r\n### Description\r\n\r\n\r\nMbed-OS v5.12\r\n\r\nI'm using a custom CMake build process because I have requirements that mbed-cli is unable to accommodate. \r\n\r\nI want to set up a rtos-less build. I've seen issues #7800 and #7794. I have no problems with eliminating the files those two issues cover, but I've also seen a reference to bare-metal in https://os.mbed.com/docs/mbed-os/v5.12/tutorials/migrating-to-mbed-os-5.html\r\n\r\nI would like to know what effect that configuration has - what defines are added or whatnot? I tried grepping for bare-metal in the mbed-cli and tools directory but came up empty. Is there a central source for the mapping of the directives in the json files to defines/files/etc that are added to the build process?\r\n\r\n### Issue request type\r\n\r\n\r\n [X] Question\r\n [ ] Enhancement\r\n [ ] Bug\r\n\r\n","html":"

What is the ultimate end effect of setting bare-metal in the configuration

\n\n\n\n

Description

\n\n

\nMbed-OS v5.12

\n\n

I'm using a custom CMake build process because I have requirements that mbed-cli is unable to accommodate.

\n\n

I want to set up a rtos-less build. I've seen issues #7800 and #7794. I have no problems with eliminating the files those two issues cover, but I've also seen a reference to bare-metal in https://os.mbed.com/docs/mbed-os/v5.12/tutorials/migrating-to-mbed-os-5.html

\n\n

I would like to know what effect that configuration has - what defines are added or whatnot? I tried grepping for bare-metal in the mbed-cli and tools directory but came up empty. Is there a central source for the mapping of the directives in the json files to defines/files/etc that are added to the build process?

\n\n

Issue request type

\n\n

\n [X] Question\n [ ] Enhancement\n [ ] Bug

\n","meta":{"source":"GitHub","url":"https://github.com/ARMmbed/mbed-os/issues/11197"},"_input_hash":2006042802,"_task_hash":912897986,"_view_id":"choice","answer":"reject","label":"DOCUMENTATION"} {"text":"Something is missing","meta":{"source":"GitHub","url":"https://github.com/DaveWM/reagent-material-ui/issues/8"},"label":"DOCUMENTATION","_input_hash":857671614,"_task_hash":1830785773,"answer":"reject"} {"text":"Docs wrong for ionSwipe event on expandable ItemSliding","meta":{"source":"GitHub","url":"https://github.com/ionic-team/ionic-site/issues/1213"},"label":"DOCUMENTATION","_input_hash":1652418808,"_task_hash":-228858447,"answer":"accept"} {"text":"Can't start minishift due to rate limit","meta":{"source":"GitHub","url":"https://github.com/minishift/minishift/issues/1184"},"label":"DOCUMENTATION","_input_hash":1309257531,"_task_hash":-1237951649,"answer":"reject"} {"text":"Consider better view for endpoint selection","meta":{"source":"GitHub","url":"https://github.com/Vrong/ovh_mail_redirections_manager_for_android/issues/2"},"label":"DOCUMENTATION","_input_hash":793359335,"_task_hash":-2035539582,"answer":"reject"} {"text":"runAll does not work on a default Fedora installation","meta":{"source":"GitHub","url":"https://github.com/lagom/lagom/issues/902"},"label":"DOCUMENTATION","_input_hash":1966910878,"_task_hash":-1221430670,"answer":"reject"} {"text":"Some click events are not dispatched in WebWorker mode","meta":{"source":"GitHub","url":"https://github.com/angular/angular/issues/18342"},"label":"DOCUMENTATION","_input_hash":1413445527,"_task_hash":-389212382,"answer":"reject"} {"text":"Sequencing Sagas via yield*","meta":{"source":"GitHub","url":"https://github.com/redux-saga/redux-saga/issues/1111"},"label":"DOCUMENTATION","_input_hash":-1772965369,"_task_hash":-1765428732,"answer":"reject"}