cargo/sources/registry/mod.rs
//! A `Source` for registry-based packages.
//!
//! # What's a Registry?
//!
//! [Registries] are central locations where packages can be uploaded to,
//! discovered, and searched for. The purpose of a registry is to have a
//! location that serves as permanent storage for versions of a crate over time.
//!
//! Compared to git sources (see [`GitSource`]), a registry provides many
//! packages as well as many versions simultaneously. Git sources can also
//! have commits deleted through rebases, whereas registries cannot have their
//! versions deleted.
//!
//! In Cargo, [`RegistryData`] is an abstraction over each kind of actual
//! registry, and [`RegistrySource`] connects those implementations to the
//! [`Source`] trait. Two prominent features these abstractions provide are:
//!
//! * A way to query the metadata of a package from a registry. The metadata
//!   comes from the index.
//! * A way to download package contents (a.k.a. source files) that are required
//!   when building the package itself.
//!
//! We'll cover each functionality later.
//!
//! [Registries]: https://doc.rust-lang.org/nightly/cargo/reference/registries.html
//! [`GitSource`]: super::GitSource
//!
//! # Different Kinds of Registries
//!
//! Cargo provides multiple kinds of registries. Each of them serves the index
//! and package contents in a slightly different way. Namely,
//!
//! * [`LocalRegistry`] --- Serves the index and package contents entirely on
//!   a local filesystem.
//! * [`RemoteRegistry`] --- Serves the index ahead of time from a Git
//!   repository, and package contents are downloaded as needed.
//! * [`HttpRegistry`] --- Serves both the index and package contents on demand
//!   over an HTTP-based registry API. This is the default starting from 1.70.0.
//!
//! Each registry has its own [`RegistryData`] implementation, and can be
//! created from either [`RegistrySource::local`] or [`RegistrySource::remote`].
//!
//! [`LocalRegistry`]: local::LocalRegistry
//! [`RemoteRegistry`]: remote::RemoteRegistry
//! [`HttpRegistry`]: http_remote::HttpRegistry
//!
//! # The Index of a Registry
//!
//! One of the major difficulties with a registry is that hosting so many
//! packages may quickly run into performance problems when dealing with
//! dependency graphs. It's infeasible for cargo to download the entire contents
//! of the registry just to resolve one package's dependencies, for example. As
//! a result, cargo needs some efficient method of querying what packages are
//! available on a registry, what versions are available, and what the
//! dependencies for each version are.
//!
//! To solve the problem, a registry must provide an index of package metadata.
//! The index of a registry is essentially an easily query-able version of the
//! registry's database for a list of versions of a package as well as a list
//! of dependencies for each version. The exact format of the index is
//! described later.
//!
//! See the [`index`] module for topics about the management, parsing, caching,
//! and versioning of the on-disk index.
//!
//! ## The Format of The Index
//!
//! The index is a store for the list of versions for all packages known, so its
//! format on disk is optimized slightly to ensure that `ls registry` doesn't
//! produce a list of all packages ever known. The index also wants to ensure
//! that there's not a million files which may actually end up hitting
//! filesystem limits at some point. To this end, a few decisions were made
//! about the format of the registry:
//!
//! 1. Each crate will have one file corresponding to it. Each version for a
//!    crate will just be a line in this file (see [`IndexPackage`] for its
//!    representation).
//! 2. There will be two tiers of directories for crate names, under which
//!    crates corresponding to those tiers will be located.
//!    (See [`cargo_util::registry::make_dep_path`] for the implementation of
//!    this layout hierarchy.)
//!
//! As an example, here is a hierarchy of an index:
//!
//! ```notrust
//! .
//! ├── 3
//! │   └── u
//! │       └── url
//! ├── bz
//! │   └── ip
//! │       └── bzip2
//! ├── config.json
//! ├── en
//! │   └── co
//! │       └── encoding
//! └── li
//!     ├── bg
//!     │   └── libgit2
//!     └── nk
//!         └── link-config
//! ```
//!
//! The root of the index contains a `config.json` file with a few entries
//! corresponding to the registry (see [`RegistryConfig`] below).
//!
//! Otherwise, there are three numbered directories (1, 2, 3) for crates with
//! names 1, 2, and 3 characters in length. The 1/2 directories simply have the
//! crate files underneath them, while the 3 directory is sharded by the first
//! letter of the crate name.
//!
//! Otherwise the top-level directory contains many two-letter directory names,
//! each of which has many sub-folders with two letters. At the end of all these
//! are the actual crate files themselves.
//!
//! The purpose of this layout is to hopefully cut down on `ls` sizes as well as
//! to allow efficient lookup based on the crate name itself.
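//!
//! To make the two-tier layout concrete, the following is a rough sketch of
//! how a crate name could be mapped to its index path. The canonical
//! implementation is [`cargo_util::registry::make_dep_path`]; the helper name
//! below is made up purely for illustration:
//!
//! ```ignore
//! /// Maps a crate name to its relative path in the index, e.g.
//! /// "url" -> "3/u/url" and "serde" -> "se/rd/serde".
//! /// (Illustrative sketch only; not the real cargo helper.)
//! fn index_path_sketch(name: &str) -> String {
//!     match name.len() {
//!         1 => format!("1/{name}"),
//!         2 => format!("2/{name}"),
//!         3 => format!("3/{}/{name}", &name[..1]),
//!         _ => format!("{}/{}/{name}", &name[..2], &name[2..4]),
//!     }
//! }
//!
//! assert_eq!(index_path_sketch("url"), "3/u/url");
//! assert_eq!(index_path_sketch("bzip2"), "bz/ip/bzip2");
//! ```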
//!
//! See [The Cargo Book: Registry Index][registry-index] for the public
//! interface on the index format.
//!
//! [registry-index]: https://doc.rust-lang.org/nightly/cargo/reference/registry-index.html
//!
//! ## The Index Files
//!
//! Each file in the index is the history of one crate over time. Each line in
//! the file corresponds to one version of a crate, stored in JSON format (see
//! the [`IndexPackage`] structure).
//!
//! As new versions are published, new lines are appended to this file. **The
//! only modifications to this file that should happen over time are yanks of a
//! particular version.**
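//!
//! As a sketch of what such a line looks like, the following parses a
//! trimmed-down index entry into a hypothetical struct that carries only a
//! few of the fields the real [`IndexPackage`] has:
//!
//! ```ignore
//! use serde::Deserialize;
//!
//! /// A heavily trimmed-down stand-in for `IndexPackage` (illustrative only).
//! #[derive(Deserialize)]
//! struct IndexLineSketch {
//!     /// Crate name.
//!     name: String,
//!     /// Version of this entry.
//!     vers: String,
//!     /// Whether this version has been yanked.
//!     #[serde(default)]
//!     yanked: bool,
//! }
//!
//! // Unknown fields (`deps`, `cksum`, ...) are ignored by serde by default.
//! let line = r#"{"name":"foo","vers":"0.1.0","deps":[],"cksum":"...","yanked":false}"#;
//! let entry: IndexLineSketch = serde_json::from_str(line).unwrap();
//! assert_eq!(entry.vers, "0.1.0");
//! assert!(!entry.yanked);
//! ```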
//!
//! # Downloading Packages
//!
//! The purpose of the index was to provide an efficient method to resolve the
//! dependency graph for a package. After resolution has been performed, we need
//! to download the contents of packages so we can read the full manifest and
//! build the source code.
//!
//! To accomplish this, [`RegistryData::download`] will "make" an HTTP request
//! per-package requested to download tarballs into a local cache. These
//! tarballs will then be unpacked into a destination folder.
//!
//! Note that because versions uploaded to the registry are frozen forever, the
//! HTTP download and unpacking can be skipped entirely if the version has
//! already been downloaded and unpacked. This caching allows us to only
//! download a package when absolutely necessary.
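//!
//! The download URL itself is built from the `dl` template found in the
//! registry's `config.json` (see [`RegistryConfig`] below). As a rough sketch
//! of that expansion, under the assumption that only the `{crate}` and
//! `{version}` markers are used:
//!
//! ```ignore
//! /// Expands a simplified `dl` template (illustrative sketch only; the real
//! /// logic also handles `{prefix}`, `{lowerprefix}`, `{sha256-checksum}`,
//! /// and the marker-less backwards-compatibility case).
//! fn expand_dl_sketch(dl_template: &str, name: &str, version: &str) -> String {
//!     dl_template
//!         .replace("{crate}", name)
//!         .replace("{version}", version)
//! }
//!
//! let url = expand_dl_sketch(
//!     "https://example.com/api/{crate}/{version}/download",
//!     "foo",
//!     "0.1.0",
//! );
//! assert_eq!(url, "https://example.com/api/foo/0.1.0/download");
//! ```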
//!
//! # Filesystem Hierarchy
//!
//! Overall, the `$HOME/.cargo` directory looks like this when talking about the
//! registry (remote registries, specifically):
//!
//! ```notrust
//! # A folder under which all registry metadata is hosted (similar to
//! # $HOME/.cargo/git)
//! $HOME/.cargo/registry/
//!
//!     # For each registry that cargo knows about (keyed by hostname + hash)
//!     # there is a folder which is the checked out version of the index for
//!     # the registry in this location. Note that this is done so cargo can
//!     # support multiple registries simultaneously
//!     index/
//!         registry1-<hash>/
//!         registry2-<hash>/
//!         ...
//!
//!     # This folder is a cache for all downloaded tarballs (`.crate` file)
//!     # from a registry. Once downloaded and verified, a tarball never changes.
//!     cache/
//!         registry1-<hash>/<pkg>-<version>.crate
//!         ...
//!
//!     # Location in which all tarballs are unpacked. Each tarball is known to
//!     # be frozen after downloading, so transitively this folder is also
//!     # frozen once it's unpacked (it's never unpacked again)
//!     # CAVEAT: They are not read-only. See rust-lang/cargo#9455.
//!     src/
//!         registry1-<hash>/<pkg>-<version>/...
//!         ...
//! ```
//!
//! [`IndexPackage`]: index::IndexPackage

use std::collections::HashSet;
use std::fs;
use std::fs::{File, OpenOptions};
use std::io;
use std::io::Read;
use std::io::Write;
use std::path::{Path, PathBuf};
use std::task::{ready, Poll};

use anyhow::Context as _;
use cargo_util::paths::{self, exclude_from_backups_and_indexing};
use flate2::read::GzDecoder;
use serde::Deserialize;
use serde::Serialize;
use tar::Archive;
use tracing::debug;

use crate::core::dependency::Dependency;
use crate::core::global_cache_tracker;
use crate::core::{Package, PackageId, SourceId};
use crate::sources::source::MaybePackage;
use crate::sources::source::QueryKind;
use crate::sources::source::Source;
use crate::sources::PathSource;
use crate::util::cache_lock::CacheLockMode;
use crate::util::interning::InternedString;
use crate::util::network::PollExt;
use crate::util::{hex, VersionExt};
use crate::util::{restricted_names, CargoResult, Filesystem, GlobalContext, LimitErrorReader};

/// The `.cargo-ok` file is used to track if the source is already unpacked.
/// See [`RegistrySource::unpack_package`] for more.
///
/// Not to be confused with the `.cargo-ok` file in git sources.
const PACKAGE_SOURCE_LOCK: &str = ".cargo-ok";

pub const CRATES_IO_INDEX: &str = "https://github.com/rust-lang/crates.io-index";
pub const CRATES_IO_HTTP_INDEX: &str = "sparse+https://index.crates.io/";
pub const CRATES_IO_REGISTRY: &str = "crates-io";
pub const CRATES_IO_DOMAIN: &str = "crates.io";

/// The content inside `.cargo-ok`.
/// See [`RegistrySource::unpack_package`] for more.
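///
/// As of version 1 this is a tiny JSON document; a sketch of the on-disk
/// content (the exact bytes are whatever `serde_json` produces for this
/// struct):
///
/// ```text
/// {"v":1}
/// ```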
#[derive(Deserialize, Serialize)]
#[serde(rename_all = "kebab-case")]
struct LockMetadata {
    /// The version of the `.cargo-ok` file.
    v: u32,
}

/// A [`Source`] implementation for a local or a remote registry.
///
/// This contains common functionality that is shared between each registry
/// kind, with the registry-specific logic implemented as part of the
/// [`RegistryData`] trait referenced via the `ops` field.
///
/// For general concepts of registries, see the [module-level documentation](crate::sources::registry).
pub struct RegistrySource<'gctx> {
    /// A unique name of the source (typically used as the directory name
    /// where its cached content is stored).
    name: InternedString,
    /// The unique identifier of this source.
    source_id: SourceId,
    /// The path where crate files are extracted (`$CARGO_HOME/registry/src/$REG-HASH`).
    src_path: Filesystem,
    /// Path to the cache of `.crate` files (`$CARGO_HOME/registry/cache/$REG-HASH`).
    cache_path: Filesystem,
    /// Local reference to [`GlobalContext`] for convenience.
    gctx: &'gctx GlobalContext,
    /// Abstraction for interfacing to the different registry kinds.
    ops: Box<dyn RegistryData + 'gctx>,
    /// Interface for managing the on-disk index.
    index: index::RegistryIndex<'gctx>,
    /// A set of packages that should be allowed to be used, even if they are
    /// yanked.
    ///
    /// This is populated from the entries in `Cargo.lock` to ensure that
    /// `cargo update somepkg` won't unlock yanked entries in `Cargo.lock`.
    /// Otherwise, the resolver would think that those entries no longer
    /// exist, and it would trigger updates to unrelated packages.
    yanked_whitelist: HashSet<PackageId>,
    /// Yanked versions that have already been selected during queries.
    ///
    /// As of this writing, this is for not emitting the `--precise <yanked>`
    /// warning twice, with the assumption of (`dep.package_name()` + `--precise`
    /// version) being sufficient to uniquely identify the same query result.
    selected_precise_yanked: HashSet<(InternedString, semver::Version)>,
}

/// The [`config.json`] file stored in the index.
///
/// The config file may look like:
///
/// ```json
/// {
///     "dl": "https://example.com/api/{crate}/{version}/download",
///     "api": "https://example.com/api",
///     "auth-required": false # unstable feature (RFC 3139)
/// }
/// ```
///
/// [`config.json`]: https://doc.rust-lang.org/nightly/cargo/reference/registry-index.html#index-configuration
#[derive(Deserialize, Debug, Clone)]
#[serde(rename_all = "kebab-case")]
pub struct RegistryConfig {
    /// Download endpoint for all crates.
    ///
    /// The string is a template which will generate the download URL for the
    /// tarball of a specific version of a crate. The substrings `{crate}` and
    /// `{version}` will be replaced with the crate's name and version
    /// respectively. The substring `{prefix}` will be replaced with the
    /// crate's prefix directory name, and the substring `{lowerprefix}` will
    /// be replaced with the crate's prefix directory name converted to
    /// lowercase. The substring `{sha256-checksum}` will be replaced with the
    /// crate's sha256 checksum.
    ///
    /// For backwards compatibility, if the string does not contain any
    /// markers (`{crate}`, `{version}`, `{prefix}`, or `{lowerprefix}`), it
    /// will be extended with `/{crate}/{version}/download` to
    /// support registries like crates.io which were created before the
    /// templating setup was created.
    ///
    /// For more on the template of the download URL, see [Index Configuration](
    /// https://doc.rust-lang.org/nightly/cargo/reference/registry-index.html#index-configuration).
    pub dl: String,

    /// API endpoint for the registry. This is what's actually hit to perform
    /// operations like yanks, owner modifications, publish new crates, etc.
    /// If this is None, the registry does not support API commands.
    pub api: Option<String>,

    /// Whether all operations require authentication. See [RFC 3139].
    ///
    /// [RFC 3139]: https://rust-lang.github.io/rfcs/3139-cargo-alternative-registry-auth.html
    #[serde(default)]
    pub auth_required: bool,
}

/// Result from loading data from a registry.
pub enum LoadResponse {
    /// The cache is valid. The cached data should be used.
    CacheValid,

    /// The cache is out of date. Returned data should be used.
    Data {
        raw_data: Vec<u8>,
        /// Version of this data to determine whether it is out of date.
        index_version: Option<String>,
    },

    /// The requested crate was not found.
    NotFound,
}

/// An abstract interface to handle both a local and remote registry.
///
/// This allows [`RegistrySource`] to abstractly handle each registry kind.
///
/// For general concepts of registries, see the [module-level documentation](crate::sources::registry).
pub trait RegistryData {
    /// Performs initialization for the registry.
    ///
    /// This should be safe to call multiple times; the implementation is
    /// expected to not do any work if it is already prepared.
    fn prepare(&self) -> CargoResult<()>;

    /// Returns the path to the index.
    ///
    /// Note that different registries store the index in different formats
    /// (remote = git, http & local = files).
    fn index_path(&self) -> &Filesystem;

    /// Loads the JSON for a specific named package from the index.
    ///
    /// * `root` is the root path to the index.
    /// * `path` is the relative path to the package to load (like `ca/rg/cargo`).
    /// * `index_version` is the version of the requested crate data currently
    ///   in cache. This is useful for checking if a local cache is outdated.
    fn load(
        &mut self,
        root: &Path,
        path: &Path,
        index_version: Option<&str>,
    ) -> Poll<CargoResult<LoadResponse>>;

    /// Loads the `config.json` file and returns it.
    ///
    /// Local registries don't have a config, and return `None`.
    fn config(&mut self) -> Poll<CargoResult<Option<RegistryConfig>>>;

    /// Invalidates locally cached data.
    fn invalidate_cache(&mut self);

    /// If quiet, the source should not display any progress or status messages.
    fn set_quiet(&mut self, quiet: bool);

    /// Is the local cached data up-to-date?
    fn is_updated(&self) -> bool;

    /// Prepare to start downloading a `.crate` file.
    ///
    /// Despite the name, this doesn't actually download anything. If the
    /// `.crate` is already downloaded, then it returns [`MaybeLock::Ready`].
    /// If it hasn't been downloaded, then it returns [`MaybeLock::Download`]
    /// which contains the URL to download. The [`crate::core::package::Downloads`]
    /// system handles the actual download process. After downloading, it
    /// calls [`Self::finish_download`] to save the downloaded file.
    ///
    /// `checksum` is currently only used by local registries to verify the
    /// file contents (because local registries never actually download
    /// anything). Remote registries will validate the checksum in
    /// `finish_download`. For already downloaded `.crate` files, it does not
    /// validate the checksum, assuming the filesystem does not suffer from
    /// corruption or manipulation.
    fn download(&mut self, pkg: PackageId, checksum: &str) -> CargoResult<MaybeLock>;

    /// Finish a download by saving a `.crate` file to disk.
    ///
    /// After [`crate::core::package::Downloads`] has finished a download,
    /// it will call this to save the `.crate` file. This is only relevant
    /// for remote registries. This should validate the checksum and save
    /// the given data to the on-disk cache.
    ///
    /// Returns a [`File`] handle to the `.crate` file, positioned at the start.
    fn finish_download(&mut self, pkg: PackageId, checksum: &str, data: &[u8])
        -> CargoResult<File>;

    /// Returns whether or not the `.crate` file is already downloaded.
    fn is_crate_downloaded(&self, _pkg: PackageId) -> bool {
        true
    }

    /// Validates that the global package cache lock is held.
    ///
    /// Given the [`Filesystem`], this will make sure that the package cache
    /// lock is held. If not, it will panic. See
    /// [`GlobalContext::acquire_package_cache_lock`] for acquiring the global lock.
    ///
    /// Returns the [`Path`] to the [`Filesystem`].
    fn assert_index_locked<'a>(&self, path: &'a Filesystem) -> &'a Path;

    /// Block until all outstanding `Poll::Pending` requests are `Poll::Ready`.
    fn block_until_ready(&mut self) -> CargoResult<()>;
}

/// The status of [`RegistryData::download`] which indicates if a `.crate`
/// file has already been downloaded, or if not then the URL to download.
pub enum MaybeLock {
    /// The `.crate` file is already downloaded. [`File`] is a handle to the
    /// opened `.crate` file on the filesystem.
    Ready(File),
    /// The `.crate` file is not downloaded, here's the URL to download it from.
    ///
    /// `descriptor` is just a text string to display to the user describing
    /// what is being downloaded.
    Download {
        url: String,
        descriptor: String,
        authorization: Option<String>,
    },
}

mod download;
mod http_remote;
pub(crate) mod index;
pub use index::IndexSummary;
mod local;
mod remote;

/// Generates a unique name for a [`SourceId`], so that each source gets a
/// unique path to put its index files.
fn short_name(id: SourceId, is_shallow: bool) -> String {
    // CAUTION: This should not change between versions. If you change how
    // this is computed, it will orphan previously cached data, forcing the
    // cache to be rebuilt and potentially wasting significant disk space. If
    // you change it, be cautious of the impact. See `test_cratesio_hash` for
    // a similar discussion.
    let hash = hex::short_hash(&id);
    let ident = id.url().host_str().unwrap_or("").to_string();
    let mut name = format!("{}-{}", ident, hash);
    if is_shallow {
        name.push_str("-shallow");
    }
    name
}

impl<'gctx> RegistrySource<'gctx> {
    /// Creates a [`Source`] of a "remote" registry.
    /// It could be either an HTTP-based [`http_remote::HttpRegistry`] or
    /// a Git-based [`remote::RemoteRegistry`].
    ///
    /// * `yanked_whitelist` --- Packages allowed to be used, even if they are yanked.
    pub fn remote(
        source_id: SourceId,
        yanked_whitelist: &HashSet<PackageId>,
        gctx: &'gctx GlobalContext,
    ) -> CargoResult<RegistrySource<'gctx>> {
        assert!(source_id.is_remote_registry());
        let name = short_name(
            source_id,
            gctx.cli_unstable()
                .git
                .map_or(false, |features| features.shallow_index)
                && !source_id.is_sparse(),
        );
        let ops = if source_id.is_sparse() {
            Box::new(http_remote::HttpRegistry::new(source_id, gctx, &name)?) as Box<_>
        } else {
            Box::new(remote::RemoteRegistry::new(source_id, gctx, &name)) as Box<_>
        };

        Ok(RegistrySource::new(
            source_id,
            gctx,
            &name,
            ops,
            yanked_whitelist,
        ))
    }

    /// Creates a [`Source`] of a local registry, with [`local::LocalRegistry`] under the hood.
    ///
    /// * `path` --- The root path of a local registry on the file system.
    /// * `yanked_whitelist` --- Packages allowed to be used, even if they are yanked.
    pub fn local(
        source_id: SourceId,
        path: &Path,
        yanked_whitelist: &HashSet<PackageId>,
        gctx: &'gctx GlobalContext,
    ) -> RegistrySource<'gctx> {
        let name = short_name(source_id, false);
        let ops = local::LocalRegistry::new(path, gctx, &name);
        RegistrySource::new(source_id, gctx, &name, Box::new(ops), yanked_whitelist)
    }

    /// Creates a source of a registry. This is an inner helper function.
    ///
    /// * `name` --- Name of a path segment which may affect where `.crate`
    ///   tarballs, the registry index and cache are stored. Expected to be unique.
    /// * `ops` --- The underlying [`RegistryData`] type.
    /// * `yanked_whitelist` --- Packages allowed to be used, even if they are yanked.
    fn new(
        source_id: SourceId,
        gctx: &'gctx GlobalContext,
        name: &str,
        ops: Box<dyn RegistryData + 'gctx>,
        yanked_whitelist: &HashSet<PackageId>,
    ) -> RegistrySource<'gctx> {
        RegistrySource {
            name: name.into(),
            src_path: gctx.registry_source_path().join(name),
            cache_path: gctx.registry_cache_path().join(name),
            gctx,
            source_id,
            index: index::RegistryIndex::new(source_id, ops.index_path(), gctx),
            yanked_whitelist: yanked_whitelist.clone(),
            ops,
            selected_precise_yanked: HashSet::new(),
        }
    }

    /// Decode the [configuration](RegistryConfig) stored within the registry.
    ///
    /// This requires that the index has been at least checked out.
    pub fn config(&mut self) -> Poll<CargoResult<Option<RegistryConfig>>> {
        self.ops.config()
    }

    /// Unpacks a downloaded package into a location where it's ready to be
    /// compiled.
    ///
    /// No action is taken if the source looks like it's already unpacked.
    ///
    /// # History of interruption detection with `.cargo-ok` file
    ///
    /// Cargo has always included a `.cargo-ok` file ([`PACKAGE_SOURCE_LOCK`])
    /// to detect if extraction was interrupted, but it was originally empty.
    ///
    /// In 1.34, Cargo was changed to create the `.cargo-ok` file before it
    /// started extraction to implement fine-grained locking. After it was
    /// finished extracting, it wrote two bytes to indicate it was complete.
    /// It would use the length check to detect if it was possibly interrupted.
    ///
    /// In 1.36, Cargo changed to not use fine-grained locking, and instead used
    /// a global lock. The use of `.cargo-ok` was no longer needed for locking
    /// purposes, but was kept to detect when extraction was interrupted.
    ///
    /// In 1.49, Cargo changed to not create the `.cargo-ok` file before it
    /// started extraction to deal with `.crate` files that inexplicably had
    /// a `.cargo-ok` file in them.
    ///
    /// In 1.64, Cargo changed to detect `.crate` files with `.cargo-ok` files
    /// in them in response to [CVE-2022-36113], which dealt with malicious
    /// `.crate` files making `.cargo-ok` a symlink, causing cargo to write "ok"
    /// to any arbitrary file on the filesystem it has permission to.
    ///
    /// In 1.71, `.cargo-ok` changed to contain a JSON `{ v: 1 }` to indicate
    /// its version. A failure to parse it results in a heavy-hammer approach
    /// that unpacks the `.crate` file again. This is in response to a
    /// security issue where the unpacking didn't respect umask on Unix systems.
    ///
    /// This is all a long-winded way of explaining the circumstances that might
    /// cause a directory to contain a `.cargo-ok` file that is empty or
    /// otherwise corrupted. Either this was extracted by a version of Rust
    /// before 1.34, in which case everything should be fine, or an empty file
    /// was created by versions 1.36 to 1.49, indicating that the extraction was
    /// interrupted and that we need to start again.
    ///
    /// Another possibility is that the filesystem is simply corrupted, in
    /// which case deleting the directory might be the safe thing to do. That
    /// is probably unlikely, though.
    ///
    /// To be safe, we delete the directory and start over again if an empty
    /// `.cargo-ok` file is found.
    ///
    /// [CVE-2022-36113]: https://blog.rust-lang.org/2022/09/14/cargo-cves.html#arbitrary-file-corruption-cve-2022-36113
    fn unpack_package(&self, pkg: PackageId, tarball: &File) -> CargoResult<PathBuf> {
        let package_dir = format!("{}-{}", pkg.name(), pkg.version());
        let dst = self.src_path.join(&package_dir);
        let path = dst.join(PACKAGE_SOURCE_LOCK);
        let path = self
            .gctx
            .assert_package_cache_locked(CacheLockMode::DownloadExclusive, &path);
        let unpack_dir = path.parent().unwrap();
        match fs::read_to_string(path) {
            Ok(ok) => match serde_json::from_str::<LockMetadata>(&ok) {
                Ok(lock_meta) if lock_meta.v == 1 => {
                    self.gctx
                        .deferred_global_last_use()?
                        .mark_registry_src_used(global_cache_tracker::RegistrySrc {
                            encoded_registry_name: self.name,
                            package_dir: package_dir.into(),
                            size: None,
                        });
                    return Ok(unpack_dir.to_path_buf());
                }
                _ => {
                    if ok == "ok" {
                        tracing::debug!("old `ok` content found, clearing cache");
                    } else {
                        tracing::warn!("unrecognized .cargo-ok content, clearing cache: {ok}");
                    }
                    // See the doc comment of `unpack_package` for why the
                    // entire directory is removed here.
                    paths::remove_dir_all(dst.as_path_unlocked())?;
                }
            },
            Err(e) if e.kind() == io::ErrorKind::NotFound => {}
            Err(e) => anyhow::bail!("unable to read .cargo-ok file at {path:?}: {e}"),
        }
        dst.create_dir()?;

        let bytes_written = unpack(self.gctx, tarball, unpack_dir, &|_| true)?;

        // Now that we've finished unpacking, create and write to the lock file to indicate that
        // unpacking was successful.
        let mut ok = OpenOptions::new()
            .create_new(true)
            .read(true)
            .write(true)
            .open(&path)
            .with_context(|| format!("failed to open `{}`", path.display()))?;

        let lock_meta = LockMetadata { v: 1 };
        write!(ok, "{}", serde_json::to_string(&lock_meta).unwrap())?;

        self.gctx
            .deferred_global_last_use()?
            .mark_registry_src_used(global_cache_tracker::RegistrySrc {
                encoded_registry_name: self.name,
                package_dir: package_dir.into(),
                size: Some(bytes_written),
            });

        Ok(unpack_dir.to_path_buf())
    }

    /// Unpacks the `.crate` tarball of the package in a given directory.
    ///
    /// Returns the path to the crate tarball directory,
    /// which is always `<unpack_dir>/<pkg>-<version>`.
    ///
    /// This holds an assumption that the associated tarball already exists.
    pub fn unpack_package_in(
        &self,
        pkg: &PackageId,
        unpack_dir: &Path,
        include: &dyn Fn(&Path) -> bool,
    ) -> CargoResult<PathBuf> {
        let path = self.cache_path.join(pkg.tarball_name());
        let path = self
            .gctx
            .assert_package_cache_locked(CacheLockMode::DownloadExclusive, &path);
        let dst = unpack_dir.join(format!("{}-{}", pkg.name(), pkg.version()));
        let tarball =
            File::open(path).with_context(|| format!("failed to open {}", path.display()))?;
        unpack(self.gctx, &tarball, &dst, include)?;
        Ok(dst)
    }

    /// Turns the downloaded `.crate` tarball file into a [`Package`].
    ///
    /// This unconditionally sets the checksum for the returned package, so it
    /// should only be called after doing an integrity check. That is to say,
    /// you need to call either [`RegistryData::download`] or
    /// [`RegistryData::finish_download`] before calling this method.
    fn get_pkg(&mut self, package: PackageId, path: &File) -> CargoResult<Package> {
        let path = self
            .unpack_package(package, path)
            .with_context(|| format!("failed to unpack package `{}`", package))?;
        let mut src = PathSource::new(&path, self.source_id, self.gctx);
        src.load()?;
        let mut pkg = match src.download(package)? {
            MaybePackage::Ready(pkg) => pkg,
            MaybePackage::Download { .. } => unreachable!(),
        };

        // After we've loaded the package, configure its summary's `checksum`
        // field with the checksum we know for this `PackageId`.
        let cksum = self
            .index
            .hash(package, &mut *self.ops)
            .expect("a downloaded dep now pending!?")
            .expect("summary not found");
        pkg.manifest_mut()
            .summary_mut()
            .set_checksum(cksum.to_string());

        Ok(pkg)
    }
}

impl<'gctx> Source for RegistrySource<'gctx> {
    fn query(
        &mut self,
        dep: &Dependency,
        kind: QueryKind,
        f: &mut dyn FnMut(IndexSummary),
    ) -> Poll<CargoResult<()>> {
        let mut req = dep.version_req().clone();

        // Handle `cargo update --precise` here.
        if let Some((_, requested)) = self
            .source_id
            .precise_registry_version(dep.package_name().as_str())
            .filter(|(c, to)| {
                if to.is_prerelease() && self.gctx.cli_unstable().unstable_options {
                    req.matches_prerelease(c)
                } else {
                    req.matches(c)
                }
            })
        {
            req.precise_to(&requested);
        }

        let mut called = false;
        let callback = &mut |s| {
            called = true;
            f(s);
        };

        // If this is a locked dependency, then it came from a lock file and in
        // theory the registry is known to contain this version. If, however, we
        // come back with no summaries, then our registry may need to be
        // updated, so we fall back to performing a lazy update.
        if kind == QueryKind::Exact && req.is_locked() && !self.ops.is_updated() {
            debug!("attempting query without update");
            ready!(self
                .index
                .query_inner(dep.package_name(), &req, &mut *self.ops, &mut |s| {
                    if matches!(s, IndexSummary::Candidate(_) | IndexSummary::Yanked(_))
                        && dep.matches(s.as_summary())
                    {
                        // We are looking for a package from a lock file so we do not care about yank
                        callback(s)
                    }
                },))?;
            if called {
                Poll::Ready(Ok(()))
            } else {
                debug!("falling back to an update");
                self.invalidate_cache();
                Poll::Pending
            }
        } else {
            let mut precise_yanked_in_use = false;
            ready!(self
                .index
                .query_inner(dep.package_name(), &req, &mut *self.ops, &mut |s| {
                    let matched = match kind {
                        QueryKind::Exact | QueryKind::RejectedVersions => {
                            if req.is_precise() && self.gctx.cli_unstable().unstable_options {
                                dep.matches_prerelease(s.as_summary())
                            } else {
                                dep.matches(s.as_summary())
                            }
                        }
                        QueryKind::AlternativeNames => true,
                        QueryKind::Normalized => true,
                    };
                    if !matched {
                        return;
                    }
                    // Next filter out all yanked packages. Some yanked packages may
                    // leak through if they're in a whitelist (aka if they were
                    // previously in `Cargo.lock`).
                    match s {
                        s @ _ if kind == QueryKind::RejectedVersions => callback(s),
                        s @ IndexSummary::Candidate(_) => callback(s),
                        s @ IndexSummary::Yanked(_) => {
                            if self.yanked_whitelist.contains(&s.package_id()) {
                                callback(s);
                            } else if req.is_precise() {
                                precise_yanked_in_use = true;
                                callback(s);
                            }
                        }
                        IndexSummary::Unsupported(summary, v) => {
                            tracing::debug!(
                                "unsupported schema version {} ({} {})",
                                v,
                                summary.name(),
                                summary.version()
                            );
                        }
                        IndexSummary::Invalid(summary) => {
                            tracing::debug!("invalid ({} {})", summary.name(), summary.version());
                        }
                        IndexSummary::Offline(summary) => {
                            tracing::debug!("offline ({} {})", summary.name(), summary.version());
                        }
                    }
                }))?;
            if precise_yanked_in_use {
                let name = dep.package_name();
                let version = req
                    .precise_version()
                    .expect("--precise <yanked-version> in use");
                if self.selected_precise_yanked.insert((name, version.clone())) {
                    let mut shell = self.gctx.shell();
                    shell.warn(format_args!(
                        "selected package `{name}@{version}` was yanked by the author"
                    ))?;
                    shell.note("if possible, try a compatible non-yanked version")?;
                }
            }
            if called {
                return Poll::Ready(Ok(()));
            }
            let mut any_pending = false;
            if kind == QueryKind::AlternativeNames || kind == QueryKind::Normalized {
                // Attempt to handle misspellings by searching for a chain of related
                // names to the original name. The resolver will later reject any
                // candidates that have the wrong name, and along the way this will
                // produce helpful "did you mean?" suggestions.
                // For now we only try canonicalizing `-` to `_` and vice versa.
                // More advanced fuzzy searching may come in the future.
                for name_permutation in [
                    dep.package_name().replace('-', "_"),
                    dep.package_name().replace('_', "-"),
                ] {
                    let name_permutation = InternedString::new(&name_permutation);
                    if name_permutation == dep.package_name() {
                        continue;
                    }
                    any_pending |= self
                        .index
                        .query_inner(name_permutation, &req, &mut *self.ops, &mut |s| {
                            if !s.is_yanked() {
                                f(s);
                            } else if kind == QueryKind::AlternativeNames {
                                f(s);
                            }
                        })?
                        .is_pending();
                }
            }
            if any_pending {
                Poll::Pending
            } else {
                Poll::Ready(Ok(()))
            }
        }
    }

    fn supports_checksums(&self) -> bool {
        true
    }

    fn requires_precise(&self) -> bool {
        false
    }

    fn source_id(&self) -> SourceId {
        self.source_id
    }

    fn invalidate_cache(&mut self) {
        self.index.clear_summaries_cache();
        self.ops.invalidate_cache();
    }

    fn set_quiet(&mut self, quiet: bool) {
        self.ops.set_quiet(quiet);
    }

    fn download(&mut self, package: PackageId) -> CargoResult<MaybePackage> {
        let hash = loop {
            match self.index.hash(package, &mut *self.ops)? {
                Poll::Pending => self.block_until_ready()?,
                Poll::Ready(hash) => break hash,
            }
        };
        match self.ops.download(package, hash)? {
            MaybeLock::Ready(file) => self.get_pkg(package, &file).map(MaybePackage::Ready),
            MaybeLock::Download {
                url,
                descriptor,
                authorization,
            } => Ok(MaybePackage::Download {
                url,
                descriptor,
                authorization,
            }),
        }
    }

    fn finish_download(&mut self, package: PackageId, data: Vec<u8>) -> CargoResult<Package> {
        let hash = loop {
            match self.index.hash(package, &mut *self.ops)? {
                Poll::Pending => self.block_until_ready()?,
                Poll::Ready(hash) => break hash,
            }
        };
        let file = self.ops.finish_download(package, hash, &data)?;
        self.get_pkg(package, &file)
    }

    fn fingerprint(&self, pkg: &Package) -> CargoResult<String> {
        Ok(pkg.package_id().version().to_string())
    }

    fn describe(&self) -> String {
        self.source_id.display_index()
    }

    fn add_to_yanked_whitelist(&mut self, pkgs: &[PackageId]) {
        self.yanked_whitelist.extend(pkgs);
    }

    fn is_yanked(&mut self, pkg: PackageId) -> Poll<CargoResult<bool>> {
        self.index.is_yanked(pkg, &mut *self.ops)
    }

    fn block_until_ready(&mut self) -> CargoResult<()> {
        // Before starting to work on the registry, make sure that
        // `<cargo_home>/registry` is marked as excluded from indexing and
        // backups. Older versions of Cargo didn't do this, so we do it here
        // regardless of whether `<cargo_home>` exists.
        //
        // This does not use `create_dir_all_excluded_from_backups_atomic` for
        // the same reason: we want to exclude it even if the directory already
        // exists.
        //
        // IO errors in creating and marking it are ignored, e.g. in case we're on a
        // read-only filesystem.
        let registry_base = self.gctx.registry_base_path();
        let _ = registry_base.create_dir();
        exclude_from_backups_and_indexing(&registry_base.into_path_unlocked());

        self.ops.block_until_ready()
    }
}

impl RegistryConfig {
    /// File name of [`RegistryConfig`].
    const NAME: &'static str = "config.json";
}

/// Gets the maximum unpack size that Cargo permits
/// based on a given `size` of your compressed file.
///
/// Returns the larger of `size * max compression ratio`
/// and a fixed max unpacked size.
///
/// In reality, the compression ratio usually falls in the range of 2:1 to 10:1.
/// We choose 20:1 to hopefully cover almost all possible cases.
/// Any ratio higher than this is considered a zip bomb.
///
/// In the future we might want to introduce a configurable size.
///
/// Some real-world data from common compression algorithms:
///
/// * <https://www.zlib.net/zlib_tech.html>
/// * <https://cran.r-project.org/web/packages/brotli/vignettes/brotli-2015-09-22.pdf>
/// * <https://blog.cloudflare.com/results-experimenting-brotli/>
/// * <https://tukaani.org/lzma/benchmarks.html>
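///
/// For example, with the default limits below, a 1 MiB tarball may expand to
/// at most `max(512 MiB, 1 MiB * 20) = 512 MiB`, while a 100 MiB tarball may
/// expand to at most `max(512 MiB, 100 MiB * 20) = 2000 MiB`.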
fn max_unpack_size(gctx: &GlobalContext, size: u64) -> u64 {
    const SIZE_VAR: &str = "__CARGO_TEST_MAX_UNPACK_SIZE";
    const RATIO_VAR: &str = "__CARGO_TEST_MAX_UNPACK_RATIO";
    const MAX_UNPACK_SIZE: u64 = 512 * 1024 * 1024; // 512 MiB
    const MAX_COMPRESSION_RATIO: usize = 20; // 20:1

    let max_unpack_size = if cfg!(debug_assertions) && gctx.get_env(SIZE_VAR).is_ok() {
        // For integration test only.
        gctx.get_env(SIZE_VAR)
            .unwrap()
            .parse()
            .expect("a max unpack size in bytes")
    } else {
        MAX_UNPACK_SIZE
    };
    let max_compression_ratio = if cfg!(debug_assertions) && gctx.get_env(RATIO_VAR).is_ok() {
        // For integration test only.
        gctx.get_env(RATIO_VAR)
            .unwrap()
            .parse()
            .expect("a max compression ratio")
    } else {
        MAX_COMPRESSION_RATIO
    };

    u64::max(max_unpack_size, size * max_compression_ratio as u64)
}

/// Sets the current [`umask`] value for the given tarball. No-op on non-Unix
/// platforms.
///
/// On Windows, tar only looks at user permissions and tries to set the "read
/// only" attribute, so this is a no-op as well.
///
/// [`umask`]: https://man7.org/linux/man-pages/man2/umask.2.html
#[allow(unused_variables)]
fn set_mask<R: Read>(tar: &mut Archive<R>) {
    #[cfg(unix)]
    tar.set_mask(crate::util::get_umask());
}

/// Unpacks a tarball with zip bomb and overwrite protections.
fn unpack(
    gctx: &GlobalContext,
    tarball: &File,
    unpack_dir: &Path,
    include: &dyn Fn(&Path) -> bool,
) -> CargoResult<u64> {
    let mut tar = {
        let size_limit = max_unpack_size(gctx, tarball.metadata()?.len());
        let gz = GzDecoder::new(tarball);
        let gz = LimitErrorReader::new(gz, size_limit);
        let mut tar = Archive::new(gz);
        set_mask(&mut tar);
        tar
    };
    let mut bytes_written = 0;
    let prefix = unpack_dir.file_name().unwrap();
    let parent = unpack_dir.parent().unwrap();
    for entry in tar.entries()? {
        let mut entry = entry.context("failed to iterate over archive")?;
        let entry_path = entry
            .path()
            .context("failed to read entry path")?
            .into_owned();

        if let Ok(path) = entry_path.strip_prefix(prefix) {
            if !include(path) {
                continue;
            }
        } else {
            // We're going to unpack this tarball into the global source
            // directory, but we want to make sure that it doesn't accidentally
            // (or maliciously) overwrite source code from other crates. Cargo
            // itself should never generate a tarball that hits this error, and
            // crates.io should also block uploads with these sorts of tarballs,
            // but be extra sure by adding a check here as well.
            anyhow::bail!(
                "invalid tarball downloaded, contains \
                 a file at {entry_path:?} which isn't under {prefix:?}",
            )
        }

        // Prevent unpacking the lockfile from the crate itself.
        if entry_path
            .file_name()
            .map_or(false, |p| p == PACKAGE_SOURCE_LOCK)
        {
            continue;
        }
        // Track the unpacked size so callers can record it in the global cache tracker.
        bytes_written += entry.size();
        let mut result = entry.unpack_in(parent).map_err(anyhow::Error::from);
        if cfg!(windows) && restricted_names::is_windows_reserved_path(&entry_path) {
            result = result.with_context(|| {
                format!(
                    "`{}` appears to contain a reserved Windows path, \
                     it cannot be extracted on Windows",
                    entry_path.display()
                )
            });
        }
        result.with_context(|| format!("failed to unpack entry at `{}`", entry_path.display()))?;
    }

    Ok(bytes_written)
}