Pass in -ci and -test to CI #2893
Conversation
|
It looks like the CI systems are not able to locate xunit in their NuGet cache. Not sure how to fix this?
|
How are you seeing the logs? I don't see the test logs being captured anywhere. What is the error?
|
He reproed locally. We're looking at it offline. I need to disconnect now for a bit but I'll repro locally and figure it out later. |
We should be capturing the logs as well, so that you don't need to rely on reproing it locally to figure out the issue. |
We do, actually. It's here: https://dev.azure.com/dnceng/public/_build/results?buildId=573155&view=artifacts&type=publishedArtifacts That log only reads "Path could not be found" but doesn't specify which exe is missing. I had to repro locally to figure out that it was xunit.
|
We also have this issue: #2708. I'll take care of it once we figure out this one.
@@ -0,0 +1,22 @@
<RuleSet Name="Tests ruleset" Description="All Rules are disabled." ToolsVersion="15.0">
pgovind
Mar 26, 2020
Author
Member
This is something that xunit knows to interpret/use?
safern
Mar 26, 2020
Member
No, this is something Roslyn understands.
The newest version of xunit brings in analyzers as well. This is the way to configure Roslyn analyzers. You can then set the severity to error, warning, or none on a per-rule, per-analyzer basis. So whenever we want to enable these rules, we will have to fix the errors we get from the analyzer.
safern
Mar 26, 2020
Member
It was just a smaller change to do this, and then we can enable the rules in chunks.
pgovind
Mar 26, 2020
Author
Member
I see. So when we want to turn on one of these rules, we change Action="None" to, say, Action="Error" (or "Warning")?
safern
Mar 26, 2020
Member
Yeah, or if you don't list them in this file they will take whatever default Action the analyzer defines: http://xunit.net/xunit.analyzers/rules/
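As a sketch of how such a ruleset entry is shaped (the rule IDs and Action values below are illustrative picks from the xunit.analyzers rule list, not the PR's actual contents):

```xml
<?xml version="1.0" encoding="utf-8"?>
<RuleSet Name="Tests ruleset" Description="All Rules are disabled." ToolsVersion="15.0">
  <Rules AnalyzerId="xunit.analyzers" RuleNamespace="Xunit.Analyzers">
    <!-- xUnit1013: public method should be marked as test. Disabled here. -->
    <Rule Id="xUnit1013" Action="None" />
    <!-- To turn a rule on, change its Action, e.g. to Warning or Error. -->
    <Rule Id="xUnit2000" Action="Warning" />
  </Rules>
</RuleSet>
```

Rules not listed in the file keep whatever default severity the analyzer ships with, which is what the comment above describes.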
|
OK, so I found the reason why tests were failing: we now use Arcade. What I did is remove the version we were using and use Arcade's version. However, that pulls in the |
@@ -41,8 +41,6 @@ public void CtorSpanOverByteArrayValidCasesWithPropertiesAndBasicOperationsCheck
Span<byte> span = new Span<byte>(array);
Assert.Equal(array.Length, span.Length);

Assert.NotSame(array, span.ToArray());
safern
Mar 26, 2020
Member
It seems like in 3.0 this behavior changed and span.ToArray returns the same array, or the test was wrong.
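For context, a minimal C# sketch of what the removed assertion was checking (this is an illustration of the assertion's intent, not the full test from the diff):

```csharp
// Sketch of the assertion under discussion.
byte[] array = { 1, 2, 3 };
Span<byte> span = new Span<byte>(array);

byte[] roundTripped = span.ToArray();

// The removed line asserted that ToArray produced a distinct array:
// Assert.NotSame(array, roundTripped);
// Per the comment above, on netcoreapp3.0 this assertion started failing,
// so either the runtime behavior changed or the test's expectation was wrong.
```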
@@ -1,6 +1,6 @@
<Project Sdk="Microsoft.NET.Sdk" ToolsVersion="15.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<TargetFrameworks>netcoreapp2.0;netcoreapp3.0</TargetFrameworks>
safern
Mar 26, 2020
Member
We don't install any runtimes when bootstrapping, which we could do by adding an entry to global.json; however, I don't think it makes sense to test against an older framework.
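A hedged sketch of the global.json entry mentioned above, assuming the Arcade `tools`/`runtimes` schema; the version numbers here are placeholders, not what the repo would actually pin:

```json
{
  "tools": {
    "dotnet": "3.1.100",
    "runtimes": {
      "dotnet": [ "2.1.13" ]
    }
  }
}
```

With an entry like this, the bootstrap step would install the listed runtime alongside the SDK, which is what would be needed to keep testing netcoreapp2.0.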
|
Looks fine to me. Nice work, everyone.
Merged commit 7e4a48c into dotnet:master


I think this is all we needed. Putting this up to test what happens in a public build.