bors [Fri, 5 Oct 2018 14:58:07 +0000 (14:58 +0000)]
Auto merge of #6132 - Xanewok:bump-libgit2-sys, r=Eh2406
Bump libgit2-sys to 0.7.9
For some reason I couldn't compile with the newest nightly `rustc 1.31.0-nightly (5597ee8a6 2018-10-03)`:
```
error: failed to run custom build command for `libgit2-sys v0.7.7`
process didn't exit successfully: `/home/xanewok/repos/cargo/target/debug/build/libgit2-sys-f38ab3eb27549370/build-script-build` (exit code: 101)
--- stdout
running: "cmake" "/home/xanewok/.cargo/registry/src/github.com-1ecc6299db9ec823/libgit2-sys-0.7.7/libgit2" "-DGIT_SSH_MEMORY_CREDENTIALS=1" "-DBUILD_SHARED_LIBS=OFF" "-DBUILD_CLAR=OFF" "-DCMAKE_INSTALL_PREFIX=/home/xanewok/repos/cargo/target/debug/build/libgit2-sys-e96505b09ca81ecd/out" "-DCMAKE_C_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_C_COMPILER=/usr/bin/cc" "-DCMAKE_CXX_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_CXX_COMPILER=/usr/bin/c++" "-DCMAKE_BUILD_TYPE=Debug"
-- The C compiler identification is GNU 7.3.0
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Found PkgConfig: /usr/bin/pkg-config (found version "0.29.1")
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Looking for pthread_create
-- Looking for pthread_create - not found
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - found
-- Found Threads: TRUE
-- Performing Test HAVE_STRUCT_STAT_ST_MTIM
-- Performing Test HAVE_STRUCT_STAT_ST_MTIM - Success
-- Performing Test HAVE_STRUCT_STAT_ST_MTIMESPEC
-- Performing Test HAVE_STRUCT_STAT_ST_MTIMESPEC - Failed
-- Performing Test HAVE_STRUCT_STAT_MTIME_NSEC
-- Performing Test HAVE_STRUCT_STAT_MTIME_NSEC - Failed
-- Performing Test HAVE_STRUCT_STAT_NSEC
-- Performing Test HAVE_STRUCT_STAT_NSEC - Success
-- Performing Test IS_WALL_SUPPORTED
-- Performing Test IS_WALL_SUPPORTED - Success
-- Performing Test IS_WEXTRA_SUPPORTED
-- Performing Test IS_WEXTRA_SUPPORTED - Success
-- Performing Test IS_WDOCUMENTATION_SUPPORTED
-- Performing Test IS_WDOCUMENTATION_SUPPORTED - Failed
-- Performing Test IS_WNO-MISSING-FIELD-INITIALIZERS_SUPPORTED
-- Performing Test IS_WNO-MISSING-FIELD-INITIALIZERS_SUPPORTED - Success
-- Performing Test IS_WSTRICT-ALIASING_SUPPORTED
-- Performing Test IS_WSTRICT-ALIASING_SUPPORTED - Success
-- Performing Test IS_WSTRICT-PROTOTYPES_SUPPORTED
-- Performing Test IS_WSTRICT-PROTOTYPES_SUPPORTED - Success
-- Performing Test IS_WDECLARATION-AFTER-STATEMENT_SUPPORTED
-- Performing Test IS_WDECLARATION-AFTER-STATEMENT_SUPPORTED - Success
-- Performing Test IS_WSHIFT-COUNT-OVERFLOW_SUPPORTED
-- Performing Test IS_WSHIFT-COUNT-OVERFLOW_SUPPORTED - Success
-- Performing Test IS_WNO-UNUSED-CONST-VARIABLE_SUPPORTED
-- Performing Test IS_WNO-UNUSED-CONST-VARIABLE_SUPPORTED - Success
-- Performing Test IS_WNO-UNUSED-FUNCTION_SUPPORTED
-- Performing Test IS_WNO-UNUSED-FUNCTION_SUPPORTED - Success
-- Looking for regcomp_l
-- Looking for regcomp_l - not found
-- Looking for futimens
-- Looking for futimens - found
-- Looking for qsort_r
-- Looking for qsort_r - found
-- Looking for qsort_s
-- Looking for qsort_s - not found
-- Looking for clock_gettime in rt
-- Looking for clock_gettime in rt - found
-- Checking for module 'libcurl'
-- Found libcurl, version 7.61.1
-- Configuring incomplete, errors occurred!
See also "/home/xanewok/repos/cargo/target/debug/build/libgit2-sys-e96505b09ca81ecd/out/build/CMakeFiles/CMakeOutput.log".
See also "/home/xanewok/repos/cargo/target/debug/build/libgit2-sys-e96505b09ca81ecd/out/build/CMakeFiles/CMakeError.log".
--- stderr
fatal: not a git repository (or any of the parent directories): .git
CMake Error at cmake/Modules/FindPkgLibraries.cmake:17 (MESSAGE):
could not resolve curl
Call Stack (most recent call first):
src/CMakeLists.txt:120 (FIND_PKGLIBRARIES)
thread 'main' panicked at '
command did not execute successfully, got: exit code: 1
build script failed, must exit now', /home/xanewok/.cargo/registry/src/github.com-1ecc6299db9ec823/cmake-0.1.29/src/lib.rs:632:5
note: Run with `RUST_BACKTRACE=1` for a backtrace.
```
on Xubuntu 18.04.1.
Simply bumping `libgit2-sys` seemed to have fixed the problem, so... :man_shrugging:
bors [Fri, 5 Oct 2018 03:49:06 +0000 (03:49 +0000)]
Auto merge of #6134 - Eh2406:dead-code, r=alexcrichton
remove dead code, use impl Iterator instead of custom types, cargo fmt
This is a mishmash of things. `Graph::sort` was unused so I removed it. There are a bunch of custom types that allow us to return iterators that can now be replaced with `impl Iterator`. jetbrains wanted to reorder some impls to match the definition. fmt also made some changes to the files I opened.
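The `impl Iterator` change above can be sketched as follows. This is an illustrative example (the `Graph`/`neighbors` names are hypothetical, not Cargo's actual types): instead of defining a custom struct that implements `Iterator` by hand, the method returns `impl Iterator` directly.

```rust
// Hypothetical sketch of the pattern: return `impl Iterator` rather than
// a hand-written iterator type.
struct Graph {
    edges: Vec<(u32, u32)>,
}

impl Graph {
    // Before the refactor this kind of method would return a custom
    // `Edges<'_>` struct implementing `Iterator` manually.
    fn neighbors(&self, node: u32) -> impl Iterator<Item = u32> + '_ {
        self.edges
            .iter()
            .filter(move |&&(from, _)| from == node)
            .map(|&(_, to)| to)
    }
}

fn main() {
    let g = Graph { edges: vec![(0, 1), (0, 2), (1, 2)] };
    let ns: Vec<u32> = g.neighbors(0).collect();
    assert_eq!(ns, vec![1, 2]);
    println!("{:?}", ns);
}
```

The caller-visible API is unchanged; only the boilerplate iterator type disappears.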
bors [Wed, 3 Oct 2018 02:54:55 +0000 (02:54 +0000)]
Auto merge of #6112 - alexcrichton:better-errors, r=Eh2406
Try to improve "version not found" error
This commit tweaks the wording for when versions aren't found in a
source, notably including more location information: not just the
source id, but also replaced sources and the like. It's hoped that this
extra detail makes it a bit more clear what's going on.
bors [Wed, 3 Oct 2018 02:11:37 +0000 (02:11 +0000)]
Auto merge of #6122 - ehuss:msys-width, r=alexcrichton
Second attempt at fixing msys terminal width.
Lock the max width on msys-based terminals to 60. I tried a lot of different
things, but I was unable to find a way to detect the correct width in mintty.
Unfortunately this means that terminals that work correctly like ConEmu will
also be capped at 60. C'est la vie.
Of course this does not affect cmd, powershell, etc.
Eric Huss [Wed, 3 Oct 2018 00:20:33 +0000 (17:20 -0700)]
Second attempt at fixing msys terminal width.
Lock the max width on msys-based terminals to 60. I tried a lot of different
things, but I was unable to find a way to detect the correct width in mintty.
Unfortunately this means that terminals that work correctly like ConEmu will
also be capped at 60. C'est la vie.
Of course this does not affect cmd, powershell, etc.
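The cap described above can be sketched like this (the function name and signature are illustrative, not Cargo's actual code; 60 is the value from the commit message):

```rust
// Sketch: on msys/mintty terminals, where the real width can't be
// detected reliably, clamp whatever width we did detect to 60 columns.
fn effective_width(detected: Option<usize>, is_msys: bool) -> Option<usize> {
    match detected {
        Some(w) if is_msys => Some(w.min(60)),
        other => other, // cmd, powershell, etc. are unaffected
    }
}

fn main() {
    assert_eq!(effective_width(Some(120), true), Some(60));
    assert_eq!(effective_width(Some(120), false), Some(120));
    assert_eq!(effective_width(None, true), None);
    println!("ok");
}
```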
Alex Crichton [Sat, 29 Sep 2018 21:37:03 +0000 (14:37 -0700)]
Try to improve "version not found" error
This commit tweaks the wording for when versions aren't found in a
source, notably including more location information: not just the
source id, but also replaced sources and the like. It's hoped that this
extra detail makes it a bit more clear what's going on.
Auto merge of #6103 - Eh2406:proptest, r=alexcrichton
Proptest/Resolver move test helper functions to support
This moves all the code in `tests/testsuite/resolve.rs` that is not testing cargo itself to `tests/testsuite/support/resolve.rs`. (follow up to #5921)
This also clears up some inconsistencies in naming between local variables in `activate_deps_loop` and `BacktrackFrame`. (follow up to #6097) This is a true refactoring; nothing about the executable has changed.
Auto merge of #6097 - Eh2406:refactoring, r=alexcrichton
small refactor
This moves the visual indicator code out of the most complicated loop in the resolver. The `activate_deps_loop` is the heart of the resolver, with all its moving pieces in play at the same time. It is completely overwhelming. This is a true refactoring; nothing about the executable has changed, but a screen full of code/comments got moved to a helper type in a different file.
I'd love to see more moved out of `activate_deps_loop` but this is the thing that came most cleanly.
edit: I was also able to move the `pop_most_constrained` code out as well. So that is now also in this PR.
Auto merge of #5921 - Eh2406:proptest, r=alexcrichton
use proptest to fuzz the resolver
This has been a long time goal. This uses proptest to generate random registry indexes and throws them at the resolver.
It would be simple to generate a registry by,
1. make a list of name and version number each picked at random
2. for each pick a list of dependencies by making a list of name and version requirements at random.
Unfortunately, it would be extremely unlikely to generate any interesting cases, as the chance that the random name you depend on was also generated as the name of a crate is vanishingly small. So this implementation works very hard to ensure that it only generates valid dependency requirements.
This is still a WIP as it has many problems:
- [x] The current strategy is very convoluted. It is hard to see that it is correct, and harder to see how it can be expanded. Thanks to @centril for working with me on IRC to get this far. Do you have advice for improving it?
- [X] It is slow as molasses when run without release. I looked with a profiler and we seem to spend 2/3 of the time in `to_url`. Maybe we can special case `example.com` for tests, like we do for `crates.io` or something? Edit: Done. `lazy_static` did its magic.
- [x] `proptest` does not yet work with `minimal-versions`, a taste of my own medicine.
- [x] I have not verified that, if I remove the fixes for other test that this regenerates them.
The current strategy does not:
- [x] generate interesting version numbers, it just does 1.0.0, 2.0.0 ...
- [x] guarantee that the version requirements are possible to meet by the crate named.
- [ ] generate features.
- [ ] generate dev-dependencies.
- [x] build deep dependency trees, it seems to prefer to generate crates with 0 or 1 dependencies so that on average the tree is 1 or 2 layers deep.
And last but not least, there are no interesting properties being tested, like:
- [ ] If resolution was successful, then all the transitive requirements are met.
- [x] If resolution was successful, then unpublishing a version of a crate that was not selected should not change that.
- [x] If resolution was unsuccessful, then it should stay unsuccessful even if any version of a crate is unpublished.
- [ ] @maurer suggested testing for consistency. Same registry, same cargo version, same lockfile, every time.
- [ ] @maurer suggested a pareto optimality property (if all else stays the same, but new package versions are released, we don't get a new lockfile where every version is <= the old one, and at least one is < the old one)
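The index-generation idea described above (only ever emit dependency requirements that name a crate that actually exists) can be sketched without proptest. This is a simplified illustration, not the PR's actual strategy: a tiny deterministic PRNG stands in for proptest's value generation, names are generated first, and dependencies are picked *by index* into that list.

```rust
// Small deterministic LCG standing in for proptest's random generation.
fn lcg(state: &mut u64) -> u64 {
    *state = state
        .wrapping_mul(6364136223846793005)
        .wrapping_add(1442695040888963407);
    *state >> 33
}

// Generate (name, deps) pairs. A crate may only depend on crates
// generated before it, so every requirement is satisfiable and the
// index is acyclic by construction.
fn gen_registry(n_crates: usize, seed: u64) -> Vec<(String, Vec<String>)> {
    let names: Vec<String> = (0..n_crates).map(|i| format!("crate{}", i)).collect();
    let mut state = seed;
    names
        .iter()
        .enumerate()
        .map(|(i, name)| {
            let max = i.min(3);
            let n_deps = if max == 0 { 0 } else { (lcg(&mut state) as usize) % (max + 1) };
            let deps = (0..n_deps)
                .map(|_| names[(lcg(&mut state) as usize) % i].clone())
                .collect();
            (name.clone(), deps)
        })
        .collect()
}

fn main() {
    let reg = gen_registry(10, 42);
    // Every dependency names a crate that exists in the registry.
    for (_, deps) in &reg {
        for d in deps {
            assert!(reg.iter().any(|(n, _)| n == d));
        }
    }
    println!("ok: {} crates, all deps valid", reg.len());
}
```

Picking dependencies by index is what avoids the "vanishingly small chance of a name collision" problem the PR describes.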
Auto merge of #6086 - alexcrichton:build-plan-dev-deps, r=ehuss
Fix `--build-plan` with dev-dependencies
Regressed in #6005: it looks like the build plan requires all packages to
be downloaded rather than just those coming out of `unit_dependencies`,
so let's make sure to download everything!
Auto merge of #6085 - alexcrichton:progress, r=ehuss
Print a line per downloaded crate
This commit brings back the old one-line-per-crate output that Cargo's
had since the beginning of time but was removed in #6005. This was
requested on our [users forum][1]
Alex Crichton [Mon, 24 Sep 2018 16:24:20 +0000 (09:24 -0700)]
Fix `--build-plan` with dev-dependencies
Regressed in #6005: it looks like the build plan requires all packages to
be downloaded rather than just those coming out of `unit_dependencies`,
so let's make sure to download everything!
Fix missing messages when --message-format=json is deeply nested
This commit switches from `serde_json::Value` to [`RawValue`](https://docs.rs/serde_json/1.0/serde_json/value/struct.RawValue.html), which can process arbitrarily deeply nested JSON content without recursion.
Alex Crichton [Mon, 24 Sep 2018 15:59:25 +0000 (08:59 -0700)]
Print a line per downloaded crate
This commit brings back the old one-line-per-crate output that Cargo's
had since the beginning of time but was removed in #6005. This was
requested on our [users forum][1]
Auto merge of #6066 - zachlute:project-to-package-docs, r=alexcrichton
Replaced 'project' with 'package' in Cargo documentation.
Partial fix for #6056.
I tried to make a distinction between places that were talking about Cargo 'projects' vs. a generic concept of a 'project' or a github 'project'. It's entirely possible I was overzealous, so please tell me if some of these changes look dumb.
Auto merge of #6026 - alexcrichton:install-config, r=ehuss
Only load `~/.cargo/config` for `cargo install`
This commit tweaks how configuration is loaded for `cargo install`, ensuring
that we only load configuration from `$HOME` instead of the current working
directory. This should make installations a little more consistent in that they
probably shouldn't cover project-local configuration but should respect global
configuration!
Alex Crichton [Fri, 14 Sep 2018 17:42:27 +0000 (10:42 -0700)]
Only load `~/.cargo/config` for `cargo install`
This commit tweaks how configuration is loaded for `cargo install`, ensuring
that we only load configuration from `$HOME` instead of the current working
directory. This should make installations a little more consistent in that they
probably shouldn't cover project-local configuration but should respect global
configuration!
Auto merge of #6068 - eddyb:oopsie-daisy-stabilize, r=alexcrichton
Remove `fix::local_paths_no_fix`, as `crate_in_paths` is getting stabilized.
Needed for rust-lang/rust#54403 (blocking RC1).
Ideally we'd also clean up the tests, e.g. removing `#![feature(rust_2018_preview)]` and `is_nightly` checks, but it's easier to just remove the only failing test (because it tests the feature gate is needed).
Auto merge of #6005 - alexcrichton:download-parallel, r=ehuss
Download crates in parallel with HTTP/2
This PR revives some of the work of https://github.com/rust-lang/cargo/pull/5161 by refactoring Cargo to make it much easier to add parallel downloads, and then does so with the `curl` crate's new `http2` feature to compile `nghttp2` as a backend.
The primary refactoring done here is to remove the concept of "download this one package" deep within a `Source`. Instead a `Source` still has a `download` method but it's considered to be largely non-blocking. If a crate needs to be downloaded it immediately returns information as to such. The `PackageSet` abstraction is now a central location for all parallel downloads, and all users of it have been refactored to be amenable to parallel downloads, when added.
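The non-blocking `download` described above might look roughly like this. This is a hypothetical sketch of the shape of the API (the names and types are illustrative, not Cargo's actual ones): `download` either hands back a cached package immediately, or returns a token telling the central downloader what to fetch.

```rust
use std::collections::HashMap;

// Sketch: a download attempt either succeeds from cache or reports
// what still needs fetching, without blocking.
#[derive(Debug, PartialEq)]
enum MaybePackage {
    Ready(String),    // package already available locally
    Download(String), // identifier the central downloader must fetch
}

struct Source {
    cache: HashMap<String, String>,
}

impl Source {
    // Largely non-blocking: never performs network I/O itself.
    fn download(&self, id: &str) -> MaybePackage {
        match self.cache.get(id) {
            Some(pkg) => MaybePackage::Ready(pkg.clone()),
            None => MaybePackage::Download(format!("https://example.com/{}", id)),
        }
    }
}

fn main() {
    let mut cache = HashMap::new();
    cache.insert("serde".to_string(), "serde-1.0".to_string());
    let src = Source { cache };
    assert_eq!(src.download("serde"), MaybePackage::Ready("serde-1.0".into()));
    assert_eq!(
        src.download("rand"),
        MaybePackage::Download("https://example.com/rand".into())
    );
    println!("ok");
}
```

Centralizing the actual fetches (in the PR, in `PackageSet`) is what makes it possible to run them in parallel later.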
Alex Crichton [Mon, 17 Sep 2018 18:59:37 +0000 (11:59 -0700)]
Avoid a debug overflow from curl
Curl doesn't guarantee that we'll get the total/current progress bars in a "as
we expect" fashion, so throw out weird data points (such as we've downloaded
more bytes than curl thinks we'll be downloading in total).
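A minimal sketch of the guard described above (the function name is illustrative, not Cargo's actual code): discard data points where curl reports more downloaded than the expected total, which would otherwise underflow a `total - current` subtraction in debug builds.

```rust
// Sketch: reject progress data points that don't make sense.
fn sanitize_progress(total: u64, current: u64) -> Option<(u64, u64)> {
    // curl may report a zero total before the server sends a length,
    // or a `current` that momentarily overshoots `total`; skip both.
    if total == 0 || current > total {
        None
    } else {
        Some((total, current))
    }
}

fn main() {
    assert_eq!(sanitize_progress(100, 42), Some((100, 42)));
    assert_eq!(sanitize_progress(100, 150), None); // overshoot
    assert_eq!(sanitize_progress(0, 10), None);    // no total yet
    println!("ok");
}
```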
Alex Crichton [Mon, 17 Sep 2018 18:42:45 +0000 (11:42 -0700)]
Tweaks to retry logic
* Don't use `chain_err` internally inside `try` because then the spurious error
printed internally is bland and doesn't contain the right error information.
Instead place the `chain_err` on the outside so it propagates up to the user
still but doesn't get printed for spurious errors.
* Handle `pending_ids` management in the case of an error on the connection.
Alex Crichton [Thu, 13 Sep 2018 03:57:01 +0000 (20:57 -0700)]
Update the progress bar for parallel downloads
This is actually a super tricky problem. We don't really have the capacity for
more than one line of update-able information in Cargo right now, so we need to
squeeze a lot of information into one line of output for Cargo. The main
constraints this tries to satisfy are:
* At all times it should be clear what's happening. Cargo shouldn't just hang
with no output when downloading a crate for a long time, a counter ideally
needs to be decreasing while the download progresses.
* If a progress bar is shown, it shouldn't jump around. This ends up just being
a surprising user experience for most. Progress bars should only ever
increase, but they may increase at different speeds.
* Cargo has, currently, at most one line of output (as mentioned above) to pack
information into. We haven't delved into fancier terminal features that
involve multiple lines of update-able output.
* When downloading crates as part of `cargo build` (the norm) we don't actually
know ahead of time how many crates are being downloaded. We rely on the
calculation of unit dependencies to naturally feed into downloading more
crates.
* Furthermore, once we decide to download a crate, we don't actually know how
big it is! We have to wait for the server to tell us how big it is.
There doesn't really seem to be a great solution that satisfies all of these
constraints unfortunately. As a result this commit implements a relatively
conservative solution which should hopefully get us most of the way there. There
isn't actually a progress bar but rather Cargo prints that it's got N crates
left to download, and if it takes a while it prints out that there are M bytes
remaining.
Unfortunately the progress is pretty choppy and jerky, not providing a smooth
UI. This appears to largely be because Cargo will synchronously extract
tarballs, which for large crates can cause a noticeable pause. Cargo's not
really prepared internally to perform this work on helper threads, but ideally
if it could do so it would improve the output quite a bit! (making it much
smoother and also able to account for the time tarball extraction takes).
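The single status line described above can be sketched as follows (names and exact wording are illustrative, not Cargo's actual output): a decreasing crate counter, with a byte count appended only once the download has dragged on.

```rust
// Sketch: pack the download status into one line of output.
fn status_line(crates_remaining: usize, bytes_remaining: u64, slow: bool) -> String {
    let mut s = format!("Downloading {} crates", crates_remaining);
    if slow {
        // Only shown once the download has taken a while.
        s.push_str(&format!(", remaining bytes: {}", bytes_remaining));
    }
    s
}

fn main() {
    assert_eq!(status_line(5, 0, false), "Downloading 5 crates");
    assert_eq!(
        status_line(2, 1048576, true),
        "Downloading 2 crates, remaining bytes: 1048576"
    );
    println!("ok");
}
```

Note the counter only ever decreases, which satisfies the "progress should never jump backwards" constraint above.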
Alex Crichton [Thu, 13 Sep 2018 02:24:33 +0000 (19:24 -0700)]
Only enable downloads through an RAII structure
This should help us scope downloads to ensure that we only ever have one
download session at once and it's explicitly scoped so we can handle the
progress bar and everything.
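The RAII idea can be sketched like this (the types are illustrative stand-ins, not Cargo's actual `PackageSet`/`Downloads`): downloads are only possible while a guard is alive, and the guard's destructor tears the session down, so at most one session exists at a time.

```rust
use std::cell::Cell;

// Sketch: a download session scoped by an RAII guard.
struct PackageSet {
    downloading: Cell<bool>,
}

struct Downloads<'a> {
    set: &'a PackageSet,
}

impl PackageSet {
    fn enable_download(&self) -> Downloads<'_> {
        assert!(!self.downloading.get(), "only one download session at once");
        self.downloading.set(true);
        Downloads { set: self }
    }
}

impl<'a> Drop for Downloads<'a> {
    fn drop(&mut self) {
        // Clear the progress bar, close connections, etc.
        self.set.downloading.set(false);
    }
}

fn main() {
    let set = PackageSet { downloading: Cell::new(false) };
    {
        let _dl = set.enable_download();
        assert!(set.downloading.get());
    } // guard dropped here: session ends
    assert!(!set.downloading.get());
    println!("ok");
}
```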
Alex Crichton [Tue, 11 Sep 2018 00:28:18 +0000 (17:28 -0700)]
Parallelize downloads with HTTP/2
This commit implements parallel downloads using `libcurl` powered by
`libnghttp2` over HTTP/2. Using all of the previous refactorings this actually
implements usage of `Multi` to download crates in parallel. This achieves some
large wins locally, taking download times from 30s to 2s in the best case.
The standard output of Cargo is also changed as a result of this commit. It's
no longer useful for Cargo to print "Downloading ..." for each crate, as
they all start instantaneously. Instead Cargo no longer prints `Downloading`
by default (unless attached to a pipe) and instead only has one progress bar for
all downloads. Currently this progress bar is discrete and based on the total
number of downloads, no longer specifying how much of one particular download
has happened. This provides a less granular view into what Cargo is doing but
it's hoped that it looks reasonable from an outside perspective as there's
still a progress bar indicating what's happening.
Alex Crichton [Mon, 10 Sep 2018 23:50:29 +0000 (16:50 -0700)]
Refactor `build_unit_dependencies` for parallel downloads
This commit refactors the `build_unit_dependencies` function for leveraging
parallel downloads. This function is the primary location where crates are
downloaded as part of `cargo build`, which is one of the most critical locations
to take advantage of parallel downloads!
The problem here is somewhat unusual in that we don't want to download the entire
crate graph but rather only the precise slice that we're compiling. This means
that we're letting the calculation of unit dependencies entirely drive the
decision of what crates to download. While we're calculating dependencies,
though, we may need to download more crates!
The strategy implemented here is to attempt to build the unit dependency graph.
Any missing packages are skipped during this construction and enqueued for
download. If any package was enqueued, we throw away the entire graph, wait for
a download to finish, and then attempt to construct the unit graph again.
The hope here is that we'll be able to incrementally make progress and
incrementally add more and more crates to be downloaded in parallel. The worry
is that this is a relatively inefficient strategy as we could reconstruct the
unit dependency graph N times (where you have N total transitive dependencies).
To help alleviate this concern, we wait for only a small handful of crates to
actively be downloading before we continue to try to recreate the dependency
graph. It's hoped that this will strike a good balance between ensuring we
always have a full pipeline of downloads without re-downloading too much.
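The build-skip-download-retry loop described above can be modeled with a toy example. This is a simplified sketch under illustrative assumptions (flat string package ids, one download "completing" per retry), not Cargo's actual implementation: an attempt to build the graph skips any package not yet downloaded, queues it, and the whole graph is rebuilt after each download completes.

```rust
use std::collections::{HashMap, HashSet, VecDeque};

// One attempt at building the dependency graph. Missing packages are
// skipped and enqueued for download; a complete traversal returns the
// graph, an incomplete one returns None.
fn build_graph<'a>(
    root: &'a str,
    deps: &HashMap<&'a str, Vec<&'a str>>,
    downloaded: &HashSet<&'a str>,
    missing: &mut VecDeque<&'a str>,
) -> Option<Vec<&'a str>> {
    let mut order = Vec::new();
    let mut seen = HashSet::new();
    let mut queue = VecDeque::from([root]);
    let mut complete = true;
    while let Some(pkg) = queue.pop_front() {
        if !seen.insert(pkg) {
            continue;
        }
        if !downloaded.contains(pkg) {
            // Skip the missing package but remember to download it.
            if !missing.contains(&pkg) {
                missing.push_back(pkg);
            }
            complete = false;
            continue;
        }
        order.push(pkg);
        for d in deps.get(pkg).into_iter().flatten() {
            queue.push_back(d);
        }
    }
    if complete { Some(order) } else { None }
}

fn main() {
    let deps: HashMap<_, _> =
        [("a", vec!["b", "c"]), ("b", vec!["c"]), ("c", vec![])].into();
    let mut downloaded: HashSet<&str> = ["a"].into();
    let mut missing = VecDeque::new();
    let mut attempts = 0;
    let graph = loop {
        attempts += 1;
        if let Some(g) = build_graph("a", &deps, &downloaded, &mut missing) {
            break g; // graph is complete
        }
        // "Wait" for one enqueued download to finish, then retry.
        let done = missing.pop_front().unwrap();
        downloaded.insert(done);
    };
    assert_eq!(graph.len(), 3);
    assert!(attempts > 1);
    println!("ok after {} attempts", attempts);
}
```

Each retry discovers the next layer of missing packages, which is exactly the incremental-progress behavior (and the worst-case N rebuilds) the commit message describes.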