105 files changed, 2356 insertions, 854 deletions
diff --git a/docs/markdown/Python-module.md b/docs/markdown/Python-module.md index 66081762d..8fa99fba7 100644 --- a/docs/markdown/Python-module.md +++ b/docs/markdown/Python-module.md @@ -130,6 +130,9 @@ the addition of the following: Additionally, the following diverge from [[shared_module]]'s default behavior: +- `install_dir` may only be a string, boolean, or unset; an `array` is not + allowed. + - `gnu_symbol_visibility`: if unset, it will default to `'hidden'` on versions of Python that support this (the python headers define `PyMODINIT_FUNC` has default visibility). diff --git a/docs/markdown/Rust-module.md b/docs/markdown/Rust-module.md index 5ce0fdcdb..ee800441e 100644 --- a/docs/markdown/Rust-module.md +++ b/docs/markdown/Rust-module.md @@ -4,6 +4,9 @@ authors: - name: Dylan Baker email: dylan@pnwbakers.com years: [2020, 2021, 2022, 2024] + - name: Paolo Bonzini + email: bonzini@gnu.org + years: [2025] ... # Rust module @@ -13,8 +16,8 @@ authors: The rust module provides helper to integrate rust code into Meson. The goal is to make using rust in Meson more pleasant, while still -remaining mesonic, this means that it attempts to make Rust work more -like Meson, rather than Meson work more like rust. +remaining mesonic. Rust conventions are adopted in order to help the +Meson user and Rust developer, rather than to make Meson work more like Rust. ## Functions @@ -168,3 +171,319 @@ Only a subset of [[shared_library]] keyword arguments are allowed: - link_depends - link_with - override_options + +### to_system_dependency() + +*Since 1.11.0* + +```meson +rustmod.to_system_dependency(dep[, name]) +``` + +Create and return an internal dependency that wraps `dep` and +defines `cfg(system_deps_have_NAME)`. This is compatible with +how the `system-deps` crate reports the availability of a system +dependency to Rust code. + +If `name` is omitted, it defaults to the name of the dependency.
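+ +For example, a sketch of exposing a system library to Rust code (the `zlib` dependency name is illustrative, and `rustmod = import('rust')` is assumed): + +```meson +zlib_dep = dependency('zlib') +# Wrap the dependency; the cfg name defaults to the dependency name, +# so Rust sources can test #[cfg(system_deps_have_zlib)]. +zlib_rust_dep = rustmod.to_system_dependency(zlib_dep) +```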
+ +### workspace() + +*Since 1.11.0* + +Basic usage: + +```meson +cargo_ws = rustmod.workspace() +``` + +With custom features: + +```meson +feature_list = get_option('f1').enabled() ? ['feature1'] : [] +feature_list += get_option('f2').enabled() ? ['feature2'] : [] +cargo_ws = rustmod.workspace(features: feature_list) +``` + +Create and return a `workspace` object for managing the project's Cargo +workspace. + +Keyword arguments: +- `default_features`: (`bool`, optional) Whether to enable default features. +- `features`: (`list[str]`, optional) List of additional features to enable globally. + +A project that wishes to use Cargo subprojects should have `Cargo.lock` and `Cargo.toml` +files in the root source directory, and should call this function before using +Cargo subprojects. + +The first invocation of `workspace()` establishes the *Cargo interpreter* +that resolves dependencies and features for both the toplevel project (the one +containing `Cargo.lock`) and all subprojects that are invoked with the `cargo` method. + +You can optionally customize the feature set by providing `default_features` +and `features` when the Cargo interpreter is established. If either of these +arguments is not specified, `default_features` defaults to `true` and +`features` to the empty list. + +Once established, the Cargo interpreter's configuration is locked. Later calls to +`workspace()` must either omit all arguments (accepting the existing configuration) +or provide the same set of features as the first call. Mismatched arguments will cause +a build error. + +The recommendation is to not specify any keyword arguments in a subproject, so +that they simply inherit the parent's configuration.
Be careful about the +difference between specifying arguments and not doing so: + +```meson +# always works regardless of parent configuration +cargo_ws = rustmod.workspace() + +# fails if parent configured different features +cargo_ws = rustmod.workspace(default_features: true) +cargo_ws = rustmod.workspace(features: []) +``` + +The first form says "use whatever features are configured," while the latter forms +say "require this specific configuration," which may conflict with the parent project. + +## Workspace object + +### workspace.packages() + +```meson +packages = ws.packages() +``` + +Returns a list of package names in the workspace. + +### workspace.package() + +```meson +pkg = ws.package([package_name]) +``` + +Returns a package object for the given workspace member. If `package_name` is +omitted, returns the object for the root package. + +Arguments: +- `package_name`: (`str`, optional) Name of the package; not needed for the + root package of a workspace + +Example usage: +```meson +rustmod = import('rust') +cargo_ws = rustmod.workspace() +pkg = cargo_ws.package() +pkg.executable(install: true) +``` + +### workspace.subproject() + +```meson +subp = ws.subproject(package_name, api) +``` + +Returns a `subproject` object for managing a specific Cargo subproject of the workspace. + +Positional arguments: +- `package_name`: (`str`) The name of the package to retrieve +- `api`: (`str`, optional) The version constraints for the package in Cargo format + +## Package object + +The objects returned by `workspace.package()` and `workspace.subproject()` provide +methods for working with individual packages in a Cargo workspace. + +### package.name(), subproject.name() + +```meson +name = pkg.name() +``` + +Returns the name of a package or subproject. + +### package.version(), subproject.version() + +```meson +version = pkg.version() +``` + +Returns the normalized version number of a package or subproject.
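+ +As an illustrative sketch (assuming the workspace has a root package), the two methods above can feed the configuration summary: + +```meson +rustmod = import('rust') +cargo_ws = rustmod.workspace() +pkg = cargo_ws.package() +# Report the crate name and normalized version at configure time. +summary('crate', pkg.name() + ' ' + pkg.version()) +```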
+ +### package.api(), subproject.api() + +```meson +api = pkg.api() +``` + +Returns the API version of a package or subproject, that is, the version up to the first +nonzero component. + +### package.features(), subproject.features() + +```meson +features = pkg.features() +``` + +Returns selected features for a specific package or subproject. + +### package.all_features(), subproject.all_features() + +```meson +all_features = pkg.all_features() +``` + +Returns all defined features for a specific package or subproject. + +### Packages only + +Package objects are able to extract information from `Cargo.toml` files, +and provide methods to query how Cargo would build this package. They +also contain convenience wrappers for non-Rust-specific functions +(`executable`, `library`, `meson.override_dependency`, etc.) that +automatically add dependencies and compiler arguments from `Cargo.toml` +information. + +#### package.rust_args() + +```meson +args = pkg.rust_args() +``` + +Returns rustc arguments for this package. + +#### package.env() + +```meson +env_vars = pkg.env() +``` + +Returns environment variables for this package. + +#### package.rust_dependency_map() + +```meson +dep_map = pkg.rust_dependency_map() +``` + +Returns the Rust dependency mapping for this package. + +#### package.dependencies() + +```meson +deps = pkg.dependencies(...) +``` + +Returns a list of dependency objects for all the dependencies required by this +Rust package, including both Rust crate dependencies and system dependencies. +The returned dependencies can be used directly in build target declarations. + +Keyword arguments: +- `dependencies`: (`bool`, default: true) Whether to include regular Rust crate dependencies +- `dev_dependencies`: (`bool`, default: false) Whether to include development dependencies (not yet implemented) +- `system_dependencies`: (`bool`, default: true) Whether to include system dependencies + +#### package.library() + +```meson +lib = pkg.library(...)
+``` + +Builds library targets for a workspace package. The method requires that +the package's `Cargo.toml` file contains the `[lib]` section or that it +is discovered from the contents of the file system. Static vs. shared library is +decided based on the crate types in `Cargo.toml`. + +Positional arguments: +- `target_name`: (`str`, optional) Name of the library target to build. +- `sources`: (`StructuredSources`, optional) Source files for the library. If omitted, + uses the path specified in the `[lib]` section of `Cargo.toml`. + +Accepts all keyword arguments from [[shared_library]] and [[static_library]]. +`rust_abi` must match the crate types and is mandatory if more than one +ABI is exposed by the crate. + +#### package.shared_module() + +```meson +lib = pkg.shared_module(...) +``` + +Builds the `cdylib` for a workspace package as a shared module. + +Accepts all keyword arguments from [[shared_module]]. + +#### package.proc_macro() + +```meson +lib = pkg.proc_macro(...) +``` + +Builds a proc-macro crate for a workspace package. + +Accepts all keyword arguments from [[shared_library]]. + +#### package.override_dependency() + +```meson +pkg.override_dependency(dep[, rust_abi: abi]) +``` + +Make the crate available as a dependency to other crates. This is the same +as calling `meson.override_dependency`, but it computes the correct dependency +name from `pkg`'s name, API version and ABI (Rust vs. C). + +Keyword arguments: +- `rust_abi`: (`str`, optional) The ABI to use for the dependency. Valid values are + `'rust'`, `'c'`, or `'proc-macro'`; the value must match the crate types and is + mandatory if more than one ABI is exposed by the crate.
+ +It is typically used with `library()` or `proc_macro()`, for example: + +```meson +lib_pkg = cargo_ws.package('myproject-lib') +lib = lib_pkg.library(install: false) +lib_pkg.override_dependency(declare_dependency(link_with: lib)) + +# The root package's Cargo.toml declares myproject-lib as a dependency +exe_pkg = cargo_ws.package() +exe_pkg.executable(install: true) +``` + +#### package.executable() + +```meson +exe = pkg.executable([target_name], [sources], ...) +``` + +Builds an executable target for a workspace package. The method requires that the +package has at least one `bin` section defined in its `Cargo.toml` file, +or one binary discovered from the contents of the file system. + +Positional arguments: +- `target_name`: (`str`, optional) Name of the binary target to build. If the package + has multiple `bin` sections in `Cargo.toml`, this argument is required and must + match one of the binary names. If omitted and there's only one binary, that binary + will be built automatically. +- `sources`: (`StructuredSources`, optional) Source files for the executable. If omitted, + uses the path specified in the corresponding `bin` section of `Cargo.toml`. + +Accepts all keyword arguments from [[executable]]. + +### Subprojects only + +#### subproject.dependency() + +```meson +dep = subproject.dependency(...) +``` + +Returns a dependency object for the subproject that can be used with other Meson targets. + +*Note*: right now, this method is implemented on top of the normal Meson function +[[dependency]]; this is subject to change in future releases. It is recommended +to always retrieve a Cargo subproject's dependency object via this method. + +Keyword arguments: +- `rust_abi`: (`str`, optional) The ABI to use for the dependency. Valid values are + `'rust'`, `'c'`, or `'proc-macro'`. The package must support the specified ABI.
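+ +As an illustrative sketch of consuming a Cargo subproject (the `anyhow` crate name is an assumption for the example): + +```meson +rustmod = import('rust') +cargo_ws = rustmod.workspace() +# Retrieve the Rust-ABI dependency object for the subproject. +anyhow_dep = cargo_ws.subproject('anyhow').dependency(rust_abi: 'rust') +executable('app', 'src/main.rs', dependencies: anyhow_dep) +```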
diff --git a/docs/markdown/Rust.md b/docs/markdown/Rust.md index 69dbc038f..7b64944ef 100644 --- a/docs/markdown/Rust.md +++ b/docs/markdown/Rust.md @@ -113,3 +113,59 @@ target name. First, dashes, spaces and dots are replaced with underscores. Sec *since 1.10.0* anything after the first `+` is dropped. This allows creating multiple targets for the same crate name, for example when the same crate is built multiple times with different features, or for both the build and the host machine. + +## Cargo interaction + +*Since 1.11.0* + +In most cases, a Rust program will use Cargo to download crates. Meson is able +to build Rust library crates based on a `Cargo.toml` file; each external crate +corresponds to a subproject. Crates that do not need a `build.rs` file +need no intervention, whereas if a `build.rs` file is present it needs to be +converted manually to Meson code. + +To enable automatic configuration of Cargo dependencies, your project must +have `Cargo.toml` and `Cargo.lock` files in the root source directory; +this enables proper feature resolution across crates. You can then +create a workspace object using the Rust module, and retrieve specific +packages from the workspace: + +```meson +rustmod = import('rust') +cargo_ws = rustmod.workspace() +anyhow_dep = cargo_ws.subproject('anyhow').dependency() +``` + +The workspace object also enables configuration of Cargo features, for example +from Meson options: + +```meson +cargo_ws = rustmod.workspace( + features: ['feature1', 'feature2']) +``` + +Finally, the workspace object is able to build targets specified in `lib` +or `bin` sections, extracting compiler arguments for dependencies and +diagnostics from the `Cargo.toml` file.
The simplest case is that of building +a simple binary crate: + +```meson +cargo_ws.package().executable(install: true) +``` + +For a workspace: + +```meson +pkg_lib = cargo_ws.package('myproject-lib') +lib = pkg_lib.library(install: false) +pkg_lib.override_dependency(declare_dependency(link_with: lib)) + +cargo_ws.package().executable(install: true) +``` + +Sources are automatically discovered, but can be specified as a +[[@structured_src]] if they are partly generated. + +It is still possible to use keyword arguments to link with non-Rust build targets, +or even to use the usual Meson functions such as [[static_library]] or +[[executable]]. diff --git a/docs/markdown/Wrap-dependency-system-manual.md b/docs/markdown/Wrap-dependency-system-manual.md index 4a7250d3b..3e251d725 100644 --- a/docs/markdown/Wrap-dependency-system-manual.md +++ b/docs/markdown/Wrap-dependency-system-manual.md @@ -372,10 +372,7 @@ crates into a series of wraps definitions. Since *1.11.0* the overlay directory dependency name for `crates.io` URLs, and the final component of the URL (possibly with the `.git` suffix removed) for "git" URLs. -Since *1.10.0* Workspace Cargo.toml are supported. For the time being it is -recommended to regroup all Cargo dependencies inside a single workspace invoked -from the main Meson project. When invoking multiple different Cargo subprojects -from Meson, feature resolution of common dependencies might be wrong. +Since *1.10.0* workspace `Cargo.toml` files are supported. ## Using wrapped projects diff --git a/docs/markdown/snippets/cargo-workspace-object.md b/docs/markdown/snippets/cargo-workspace-object.md new file mode 100644 index 000000000..1ef767529 --- /dev/null +++ b/docs/markdown/snippets/cargo-workspace-object.md @@ -0,0 +1,14 @@ +## Cargo workspace object + +Meson is now able to parse the toplevel `Cargo.toml` file of the +project when the `workspace()` method of the Rust module is called.
+This guarantees that features are resolved according to what is +in the `Cargo.toml` file, and in fact enables configuration of +features for the build. + +The returned object allows retrieving features and dependencies +for Cargo subprojects, and contains methods to build targets +declared in `Cargo.toml` files. + +While Cargo subprojects remain experimental, the Meson project will +try to keep the workspace object reasonably backwards-compatible. diff --git a/docs/yaml/functions/_build_target_base.yaml b/docs/yaml/functions/_build_target_base.yaml index 112953387..1c5ee8f91 100644 --- a/docs/yaml/functions/_build_target_base.yaml +++ b/docs/yaml/functions/_build_target_base.yaml @@ -169,13 +169,17 @@ kwargs: description: When set to true, this executable should be installed. install_dir: - type: str + type: array[str | bool] description: | override install directory for this file. If the value is a relative path, it will be considered relative the `prefix` option. For example, if you want to install plugins into a subdir, you'd use something like this: `install_dir : get_option('libdir') / 'projectname-1.0'`. + This can be set to an array of values to control the installation path + of build targets with multiple outputs. Currently, that means Vala. Setting + an entry to `true` means "use the default", and `false` means "do not install". + install_mode: type: array[str | int] since: 0.47.0 diff --git a/docs/yaml/functions/install_headers.yaml b/docs/yaml/functions/install_headers.yaml index 42f64624b..30332b3ea 100644 --- a/docs/yaml/functions/install_headers.yaml +++ b/docs/yaml/functions/install_headers.yaml @@ -11,7 +11,7 @@ description: | Please note that this can only install static files from the source tree. Generated files are installed via the `install_dir:` kwarg on the respective - generators, such as `custom_target()` or `configure_file(). + generators, such as `custom_target()` or `configure_file()`.
example: | For example, this will install `common.h` and `kola.h` into diff --git a/docs/yaml/functions/summary.yaml b/docs/yaml/functions/summary.yaml index b5312821e..f3b897540 100644 --- a/docs/yaml/functions/summary.yaml +++ b/docs/yaml/functions/summary.yaml @@ -47,18 +47,17 @@ example: | My Project 1.0 Directories - prefix : /opt/gnome bindir : bin libdir : lib/x86_64-linux-gnu datadir : share Configuration - Some boolean : False - Another boolean: True + Some boolean : false + Another boolean: true Some string : Hello World - An array : string + An array : string 1 - True + true ``` arg_flattening: false diff --git a/mesonbuild/ast/printer.py b/mesonbuild/ast/printer.py index 024b62b9e..60aeef100 100644 --- a/mesonbuild/ast/printer.py +++ b/mesonbuild/ast/printer.py @@ -31,7 +31,7 @@ def precedence_level(node: mparser.BaseNode) -> int: return 6 elif isinstance(node, (mparser.NotNode, mparser.UMinusNode)): return 7 - elif isinstance(node, mparser.FunctionNode): + elif isinstance(node, (mparser.FunctionNode, mparser.IndexNode, mparser.MethodNode)): return 8 elif isinstance(node, (mparser.ArrayNode, mparser.DictNode)): return 9 diff --git a/mesonbuild/backend/backends.py b/mesonbuild/backend/backends.py index dddcf67d8..21f90661f 100644 --- a/mesonbuild/backend/backends.py +++ b/mesonbuild/backend/backends.py @@ -494,7 +494,7 @@ class Backend: for obj in objects: if isinstance(obj, str): o = os.path.join(proj_dir_to_build_root, - self.build_to_src, target.get_builddir(), obj) + self.build_to_src, target.get_subdir(), obj) obj_list.append(o) elif isinstance(obj, mesonlib.File): if obj.is_built: @@ -1458,9 +1458,9 @@ class Backend: deps.append(i.rel_to_builddir(self.build_to_src)) else: if absolute_paths: - deps.append(os.path.join(self.environment.get_source_dir(), target.subdir, i)) + deps.append(os.path.join(self.environment.get_source_dir(), target.get_subdir(), i)) else: - deps.append(os.path.join(self.build_to_src, target.subdir, i)) + 
deps.append(os.path.join(self.build_to_src, target.get_subdir(), i)) return deps def get_custom_target_output_dir(self, target: build.AnyTargetType) -> str: @@ -1540,7 +1540,7 @@ class Backend: if '@BUILD_ROOT@' in i: i = i.replace('@BUILD_ROOT@', build_root) if '@CURRENT_SOURCE_DIR@' in i: - i = i.replace('@CURRENT_SOURCE_DIR@', os.path.join(source_root, target.subdir)) + i = i.replace('@CURRENT_SOURCE_DIR@', os.path.join(source_root, target.get_subdir())) if '@DEPFILE@' in i: if target.depfile is None: msg = f'Custom target {target.name!r} has @DEPFILE@ but no depfile ' \ @@ -1601,7 +1601,7 @@ class Backend: if target.default_env: env.set('MESON_SOURCE_ROOT', [self.environment.get_source_dir()]) env.set('MESON_BUILD_ROOT', [self.environment.get_build_dir()]) - env.set('MESON_SUBDIR', [target.subdir]) + env.set('MESON_SUBDIR', [target.get_subdir()]) env.set('MESONINTROSPECT', [self.get_introspect_command()]) return env @@ -1690,10 +1690,10 @@ class Backend: # Sanity-check the outputs and install_dirs num_outdirs, num_out = len(outdirs), len(t.get_outputs()) if num_outdirs not in {1, num_out}: - m = 'Target {!r} has {} outputs: {!r}, but only {} "install_dir"s were found.\n' \ + m = 'Target {!r} has {} outputs: {!r}, but {} "install_dir"s were found: {!r}.\n' \ "Pass 'false' for outputs that should not be installed and 'true' for\n" \ 'using the default installation directory for an output.' 
- raise MesonException(m.format(t.name, num_out, t.get_outputs(), num_outdirs)) + raise MesonException(m.format(t.name, num_out, t.get_outputs(), num_outdirs, outdirs)) assert len(t.install_tag) == num_out install_mode = t.get_custom_install_mode() # because mypy gets confused type narrowing in lists @@ -1938,7 +1938,7 @@ class Backend: elif isinstance(j, str): source_list += [os.path.join(self.source_dir, j)] elif isinstance(j, (build.CustomTarget, build.BuildTarget)): - source_list += [os.path.join(self.build_dir, j.get_subdir(), o) for o in j.get_outputs()] + source_list += [os.path.join(self.build_dir, j.get_builddir(), o) for o in j.get_outputs()] source_list = [os.path.normpath(s) for s in source_list] compiler: T.List[str] = [] diff --git a/mesonbuild/backend/ninjabackend.py b/mesonbuild/backend/ninjabackend.py index 6dbb35144..f6139127d 100644 --- a/mesonbuild/backend/ninjabackend.py +++ b/mesonbuild/backend/ninjabackend.py @@ -1866,7 +1866,7 @@ class NinjaBackend(backends.Backend): if isinstance(gen, GeneratedList): ssrc = os.path.join(self.get_target_private_dir(target), ssrc) else: - ssrc = os.path.join(gen.get_subdir(), ssrc) + ssrc = os.path.join(gen.get_builddir(), ssrc) if ssrc.endswith('.pyx'): output = os.path.join(self.get_target_private_dir(target), f'{ssrc}.{ext}') element = NinjaBuildElement( @@ -1879,7 +1879,7 @@ class NinjaBackend(backends.Backend): # TODO: introspection? 
cython_sources.append(output) else: - generated_sources[ssrc] = mesonlib.File.from_built_file(gen.get_subdir(), ssrc) + generated_sources[ssrc] = mesonlib.File.from_built_file(gen.get_builddir(), ssrc) # Following logic in L883-900 where we determine whether to add generated source # as a header(order-only) dep to the .so compilation rule if not compilers.is_source(ssrc) and \ @@ -2001,7 +2001,7 @@ class NinjaBackend(backends.Backend): else: for h in g.get_outputs(): if h.endswith('.rs'): - main_rust_file = os.path.join(g.get_subdir(), h) + main_rust_file = os.path.join(g.get_builddir(), h) break if main_rust_file is not None: break @@ -2025,7 +2025,7 @@ class NinjaBackend(backends.Backend): if isinstance(g, GeneratedList): fname = os.path.join(self.get_target_private_dir(target), i) else: - fname = os.path.join(g.get_subdir(), i) + fname = os.path.join(g.get_builddir(), i) if main_rust_file is None and fname.endswith('.rs'): main_rust_file = fname orderdeps.append(fname) @@ -3415,10 +3415,10 @@ https://gcc.gnu.org/bugzilla/show_bug.cgi?id=47485''')) if pch[1] is None: # Auto generate PCH. 
source = self.create_msvc_pch_implementation(target, compiler.get_language(), pch[0]) - pch_header_dir = os.path.dirname(os.path.join(self.build_to_src, target.get_source_subdir(), header)) + pch_header_dir = os.path.dirname(os.path.join(self.build_to_src, target.get_subdir(), header)) commands += compiler.get_include_args(pch_header_dir, False) else: - source = os.path.join(self.build_to_src, target.get_source_subdir(), pch[1]) + source = os.path.join(self.build_to_src, target.get_subdir(), pch[1]) just_name = os.path.basename(header) (objname, pch_args) = compiler.gen_pch_args(just_name, source, dst) @@ -3459,16 +3459,16 @@ https://gcc.gnu.org/bugzilla/show_bug.cgi?id=47485''')) compiler: Compiler = target.compilers[lang] if compiler.get_argument_syntax() == 'msvc': (commands, dep, dst, objs, src) = self.generate_msvc_pch_command(target, compiler, pch) - extradep = os.path.join(self.build_to_src, target.get_source_subdir(), pch[0]) + extradep = os.path.join(self.build_to_src, target.get_subdir(), pch[0]) elif compiler.id == 'intel': # Intel generates on target generation continue elif 'mwcc' in compiler.id: - src = os.path.join(self.build_to_src, target.get_source_subdir(), pch[0]) + src = os.path.join(self.build_to_src, target.get_subdir(), pch[0]) (commands, dep, dst, objs) = self.generate_mwcc_pch_command(target, compiler, pch[0]) extradep = None else: - src = os.path.join(self.build_to_src, target.get_source_subdir(), pch[0]) + src = os.path.join(self.build_to_src, target.get_subdir(), pch[0]) (commands, dep, dst, objs) = self.generate_gcc_pch_command(target, compiler, pch[0]) extradep = None pch_objects += objs diff --git a/mesonbuild/backend/vs2010backend.py b/mesonbuild/backend/vs2010backend.py index cc7ce1af9..1c33a2116 100644 --- a/mesonbuild/backend/vs2010backend.py +++ b/mesonbuild/backend/vs2010backend.py @@ -1787,7 +1787,7 @@ class Vs2010Backend(backends.Backend): self.add_additional_options(lang, inc_cl, file_args) 
self.add_preprocessor_defines(lang, inc_cl, file_defines) self.add_include_dirs(lang, inc_cl, file_inc_dirs) - s = File.from_built_file(target.get_subdir(), s) + s = File.from_built_file(target.get_builddir(), s) ET.SubElement(inc_cl, 'ObjectFileName').text = "$(IntDir)" + \ self.object_filename_from_source(target, compiler, s) for lang, headers in pch_sources.items(): diff --git a/mesonbuild/backend/xcodebackend.py b/mesonbuild/backend/xcodebackend.py index b4f43f338..3da7bfe42 100644 --- a/mesonbuild/backend/xcodebackend.py +++ b/mesonbuild/backend/xcodebackend.py @@ -1813,7 +1813,7 @@ class XCodeBackend(backends.Backend): settings_dict.add_item('GCC_SYMBOLS_PRIVATE_EXTERN', 'NO') unquoted_headers = [self.get_target_private_dir_abs(target)] if target.implicit_include_directories: - unquoted_headers.append(os.path.join(self.environment.get_build_dir(), target.get_subdir())) + unquoted_headers.append(os.path.join(self.environment.get_build_dir(), target.get_builddir())) unquoted_headers.append(os.path.join(self.environment.get_source_dir(), target.get_subdir())) unquoted_headers += headerdirs settings_dict.add_item('HEADER_SEARCH_PATHS', self.normalize_header_search_paths(unquoted_headers)) diff --git a/mesonbuild/build.py b/mesonbuild/build.py index 906f55289..2ae47a858 100644 --- a/mesonbuild/build.py +++ b/mesonbuild/build.py @@ -22,7 +22,7 @@ from . 
import programs from .mesonlib import ( HoldableObject, SecondLevelHolder, File, MesonException, MachineChoice, PerMachine, OrderedSet, listify, - extract_as_list, typeslistify, classify_unity_sources, + classify_unity_sources, get_filenames_templates_dict, substitute_values, has_path_sep, is_parent_path, relpath, PerMachineDefaultable, MesonBugException, EnvironmentVariables, pickle_load, lazy_property, @@ -52,6 +52,7 @@ if T.TYPE_CHECKING: GeneratedTypes: TypeAlias = T.Union['CustomTarget', 'CustomTargetIndex', 'GeneratedList'] LibTypes: TypeAlias = T.Union['SharedLibrary', 'StaticLibrary', 'CustomTarget', 'CustomTargetIndex'] BuildTargetTypes: TypeAlias = T.Union['BuildTarget', 'CustomTarget', 'CustomTargetIndex'] + StaticTargetTypes: TypeAlias = T.Union['StaticLibrary', 'CustomTarget', 'CustomTargetIndex'] ObjectTypes: TypeAlias = T.Union[str, 'File', 'ExtractedObjects', 'GeneratedTypes'] AnyTargetType: TypeAlias = T.Union['Target', 'CustomTargetIndex'] RustCrateType: TypeAlias = Literal['bin', 'lib', 'rlib', 'dylib', 'cdylib', 'staticlib', 'proc-macro'] @@ -90,7 +91,7 @@ if T.TYPE_CHECKING: link_args: T.List[str] link_depends: T.List[T.Union[str, File, CustomTarget, CustomTargetIndex]] link_language: str - link_whole: T.List[T.Union[StaticLibrary, CustomTarget, CustomTargetIndex]] + link_whole: T.List[StaticTargetTypes] link_with: T.List[BuildTargetTypes] name_prefix: T.Optional[str] name_suffix: T.Optional[str] @@ -795,8 +796,8 @@ class BuildTarget(Target): self.external_deps: T.List[dependencies.Dependency] = [] self.include_dirs: T.List['IncludeDirs'] = [] self.link_language = kwargs.get('link_language') - self.link_targets: T.List[LibTypes] = [] - self.link_whole_targets: T.List[T.Union[StaticLibrary, CustomTarget, CustomTargetIndex]] = [] + self.link_targets: T.List[BuildTargetTypes] = [] + self.link_whole_targets: T.List[StaticTargetTypes] = [] self.depend_files: T.List[File] = [] self.link_depends: T.List[T.Union[File, BuildTargetTypes]] = [] 
self.added_deps = set() @@ -1130,7 +1131,7 @@ class BuildTarget(Target): self.link_depends.append(s) elif isinstance(s, str): self.link_depends.append( - File.from_source_file(self.environment.source_dir, self.subdir, s)) + File.from_source_file(self.environment.source_dir, self.get_subdir(), s)) else: self.link_depends.append(s) @@ -1245,7 +1246,7 @@ class BuildTarget(Target): result: OrderedSet[str] = OrderedSet() for i in self.link_targets: if not isinstance(i, StaticLibrary): - result.add(i.get_subdir()) + result.add(i.get_builddir()) result.update(i.get_link_dep_subdirs()) return result @@ -1285,15 +1286,12 @@ class BuildTarget(Target): self.process_link_depends(kwargs.get('link_depends', [])) # Target-specific include dirs must be added BEFORE include dirs from # internal deps (added inside self.add_deps()) to override them. - inclist = extract_as_list(kwargs, 'include_directories') - self.add_include_dirs(inclist) + self.add_include_dirs(kwargs.get('include_directories', [])) # Add dependencies (which also have include_directories) - deplist = extract_as_list(kwargs, 'dependencies') - self.add_deps(deplist) + self.add_deps(kwargs.get('dependencies', [])) # If an item in this list is False, the output corresponding to # the list index of that item will not be installed - self.install_dir = typeslistify(kwargs.get('install_dir', []), - (str, bool)) + self.install_dir = kwargs.get('install_dir', []) self.install_mode = kwargs.get('install_mode', None) self.install_tag: T.List[T.Optional[str]] = kwargs.get('install_tag') or [None] self.extra_files = kwargs.get('extra_files', []) @@ -1402,9 +1400,6 @@ class BuildTarget(Target): for t in self.link_whole_targets: t.get_dependencies_recurse(result, include_internals, include_proc_macros) - def get_source_subdir(self): - return self.subdir - def get_sources(self) -> T.List[File]: return self.sources @@ -1423,8 +1418,7 @@ class BuildTarget(Target): def get_include_dirs(self) -> T.List['IncludeDirs']: return 
self.include_dirs - def add_deps(self, deps): - deps = listify(deps) + def add_deps(self, deps: T.List[dependencies.Dependency]) -> None: for dep in deps: if dep in self.added_deps: # Prefer to add dependencies to added_deps which have a name @@ -1443,38 +1437,18 @@ class BuildTarget(Target): self.link_whole_targets.extend(dep.whole_libraries) if dep.get_compile_args() or dep.get_link_args(): # Those parts that are external. - extpart = dependencies.InternalDependency('undefined', - [], - dep.get_compile_args(), - dep.get_link_args(), - [], [], [], [], [], {}, [], [], [], - dep.name) + extpart = type(dep)(dep.version, + compile_args=dep.get_compile_args(), + link_args=dep.get_link_args(), + name=dep.name) self.external_deps.append(extpart) # Deps of deps. self.add_deps(dep.ext_deps) - elif isinstance(dep, dependencies.Dependency): + else: if dep not in self.external_deps: self.external_deps.append(dep) self.process_sourcelist(dep.get_sources()) self.add_deps(dep.ext_deps) - elif isinstance(dep, BuildTarget): - raise InvalidArguments(f'Tried to use a build target {dep.name} as a dependency of target {self.name}.\n' - 'You probably should put it in link_with instead.') - else: - # This is a bit of a hack. We do not want Build to know anything - # about the interpreter so we can't import it and use isinstance. - # This should be reliable enough. 
- if hasattr(dep, 'held_object'): - # FIXME: subproject is not a real ObjectHolder so we have to do this by hand - dep = dep.held_object - if hasattr(dep, 'project_args_frozen') or hasattr(dep, 'global_args_frozen'): - raise InvalidArguments('Tried to use subproject object as a dependency.\n' - 'You probably wanted to use a dependency declared in it instead.\n' - 'Access it by calling get_variable() on the subproject object.') - raise InvalidArguments(f'Argument is of an unacceptable type {type(dep).__name__!r}.\nMust be ' - 'either an external dependency (returned by find_library() or ' - 'dependency()) or an internal dependency (returned by ' - 'declare_dependency()).') dep_d_features = dep.d_features @@ -1492,62 +1466,24 @@ class BuildTarget(Target): def link(self, targets: T.List[BuildTargetTypes]) -> None: for t in targets: - if not isinstance(t, (Target, CustomTargetIndex)): - if isinstance(t, dependencies.ExternalLibrary): - raise MesonException(textwrap.dedent('''\ - An external library was used in link_with keyword argument, which - is reserved for libraries built as part of this project. External - libraries must be passed using the dependencies keyword argument - instead, because they are conceptually "external dependencies", - just like those detected with the dependency() function. - ''')) - raise InvalidArguments(f'{t!r} is not a target.') - if not t.is_linkable_target(): - raise InvalidArguments(f"Link target '{t!s}' is not linkable.") - if isinstance(self, StaticLibrary) and self.install and t.is_internal(): - # When we're a static library and we link_with to an - # internal/convenience library, promote to link_whole. - self.link_whole([t], promoted=True) - continue - if isinstance(self, SharedLibrary) and isinstance(t, StaticLibrary) and not t.pic: - msg = f"Can't link non-PIC static library {t.name!r} into shared library {self.name!r}. " - msg += "Use the 'pic' option to static_library to build with PIC." 
- raise InvalidArguments(msg) self.check_can_link_together(t) self.link_targets.append(t) - def link_whole(self, targets: T.List[BuildTargetTypes], promoted: bool = False) -> None: + def link_whole( + self, + targets: T.List[StaticTargetTypes], + promoted: bool = False) -> None: for t in targets: - if isinstance(t, (CustomTarget, CustomTargetIndex)): - if not t.is_linkable_target(): - raise InvalidArguments(f'Custom target {t!r} is not linkable.') - if t.links_dynamically(): - raise InvalidArguments('Can only link_whole custom targets that are static archives.') - elif not isinstance(t, StaticLibrary): - raise InvalidArguments(f'{t!r} is not a static library.') - elif isinstance(self, SharedLibrary) and not t.pic: - msg = f"Can't link non-PIC static library {t.name!r} into shared library {self.name!r}. " - msg += "Use the 'pic' option to static_library to build with PIC." - raise InvalidArguments(msg) self.check_can_link_together(t) - if isinstance(self, StaticLibrary): - # When we're a static library and we link_whole: to another static - # library, we need to add that target's objects to ourselves. - self._bundle_static_library(t, promoted) - # If we install this static library we also need to include objects - # from all uninstalled static libraries it depends on. 
- if self.install: - for lib in t.get_internal_static_libraries(): - self._bundle_static_library(lib, True) self.link_whole_targets.append(t) @lru_cache(maxsize=None) - def get_internal_static_libraries(self) -> OrderedSet[BuildTargetTypes]: - result: OrderedSet[BuildTargetTypes] = OrderedSet() + def get_internal_static_libraries(self) -> OrderedSet[StaticTargetTypes]: + result: OrderedSet[StaticTargetTypes] = OrderedSet() self.get_internal_static_libraries_recurse(result) return result - def get_internal_static_libraries_recurse(self, result: OrderedSet[BuildTargetTypes]) -> None: + def get_internal_static_libraries_recurse(self, result: OrderedSet[StaticTargetTypes]) -> None: for t in self.link_targets: if t.is_internal() and t not in result: result.add(t) @@ -1556,28 +1492,6 @@ class BuildTarget(Target): if t.is_internal(): t.get_internal_static_libraries_recurse(result) - def _bundle_static_library(self, t: T.Union[BuildTargetTypes], promoted: bool = False) -> None: - if self.uses_rust(): - # Rustc can bundle static libraries, no need to extract objects. - self.link_whole_targets.append(t) - elif isinstance(t, (CustomTarget, CustomTargetIndex)) or t.uses_rust(): - # To extract objects from a custom target we would have to extract - # the archive, WIP implementation can be found in - # https://github.com/mesonbuild/meson/pull/9218. - # For Rust C ABI we could in theory have access to objects, but there - # are several meson issues that need to be fixed: - # https://github.com/mesonbuild/meson/issues/10722 - # https://github.com/mesonbuild/meson/issues/10723 - # https://github.com/mesonbuild/meson/issues/10724 - m = (f'Cannot link_whole a custom or Rust target {t.name!r} into a static library {self.name!r}. 
' - 'Instead, pass individual object files with the "objects:" keyword argument if possible.') - if promoted: - m += (f' Meson had to promote link to link_whole because {self.name!r} is installed but not {t.name!r},' - f' and thus has to include objects from {t.name!r} to be usable.') - raise InvalidArguments(m) - else: - self.objects.append(t.extract_all_objects()) - def check_can_link_together(self, t: BuildTargetTypes) -> None: links_with_rust_abi = isinstance(t, BuildTarget) and t.uses_rust_abi() if not self.uses_rust() and links_with_rust_abi: @@ -1589,18 +1503,12 @@ class BuildTarget(Target): else: mlog.warning(msg + ' This will fail in cross build.') - def add_include_dirs(self, args: T.Sequence['IncludeDirs'], set_is_system: T.Optional[str] = None) -> None: - ids: T.List['IncludeDirs'] = [] - for a in args: - if not isinstance(a, IncludeDirs): - raise InvalidArguments('Include directory to be added is not an include directory object.') - ids.append(a) - if set_is_system is None: - set_is_system = 'preserve' + def add_include_dirs(self, args: T.Sequence['IncludeDirs'], set_is_system: str = 'preserve') -> None: if set_is_system != 'preserve': is_system = set_is_system == 'system' - ids = [IncludeDirs(x.get_curdir(), x.get_incdirs(), is_system, x.get_extra_build_dirs()) for x in ids] - self.include_dirs += ids + self.include_dirs.extend([IncludeDirs(x.get_curdir(), x.get_incdirs(), is_system, x.get_extra_build_dirs()) for x in args]) + else: + self.include_dirs.extend(args) def get_aliases(self) -> T.List[T.Tuple[str, str, str]]: return [] @@ -1793,7 +1701,7 @@ class BuildTarget(Target): self.vs_module_defs = path else: # When passing output of a Custom Target - self.vs_module_defs = File.from_built_file(path.get_subdir(), path.get_filename()) + self.vs_module_defs = File.from_built_file(path.get_builddir(), path.get_filename()) self.process_link_depends([path]) def extract_targets_as_list(self, kwargs: BuildTargetKeywordArguments, key: T.Literal['link_with', 
'link_whole']) -> T.List[LibTypes]: @@ -2038,7 +1946,7 @@ class Generator(HoldableObject): for e in files: if isinstance(e, (CustomTarget, CustomTargetIndex)): output.depends.add(e) - fs = [File.from_built_file(e.get_subdir(), f) for f in e.get_outputs()] + fs = [File.from_built_file(e.get_builddir(), f) for f in e.get_outputs()] elif isinstance(e, GeneratedList): if preserve_path_from: raise InvalidArguments("generator.process: 'preserve_path_from' is not allowed if one input is a 'generated_list'.") @@ -2409,6 +2317,54 @@ class StaticLibrary(BuildTarget): result.link_targets = [t.get(lib_type, True) for t in self.link_targets] return result + def link_whole( + self, + targets: T.List[StaticTargetTypes], + promoted: bool = False) -> None: + for t in targets: + self.check_can_link_together(t) + + # When we're a static library and we link_whole: to another static + # library, we need to add that target's objects to ourselves. + self._bundle_static_library(t, promoted) + + # If we install this static library we also need to include objects + # from all uninstalled static libraries it depends on. + if self.install: + for lib in t.get_internal_static_libraries(): + self._bundle_static_library(lib, True) + self.link_whole_targets.append(t) + + def link(self, targets: T.List[BuildTargetTypes]) -> None: + for t in targets: + if self.install and t.is_internal(): + # When we're a static library and we link_with to an + # internal/convenience library, promote to link_whole. + self.link_whole([t], promoted=True) + continue + self.check_can_link_together(t) + self.link_targets.append(t) + + def _bundle_static_library(self, t: StaticTargetTypes, promoted: bool = False) -> None: + if self.uses_rust(): + # Rustc can bundle static libraries, no need to extract objects. 
+ self.link_whole_targets.append(t) + elif isinstance(t, (CustomTarget, CustomTargetIndex)) or t.uses_rust(): + # To extract objects from a custom target we would have to extract + # the archive, WIP implementation can be found in + # https://github.com/mesonbuild/meson/pull/9218. + # For Rust C ABI we could in theory have access to objects, but we + # don't currently build them in such a way that this is possible: + # https://github.com/mesonbuild/meson/issues/10724 + m = (f'Cannot link_whole a custom or Rust target {t.name!r} into a static library {self.name!r}. ' + 'Instead, pass individual object files with the "objects:" keyword argument if possible.') + if promoted: + m += (f' Meson had to promote link to link_whole because {self.name!r} is installed but not {t.name!r},' + f' and thus has to include objects from {t.name!r} to be usable.') + raise InvalidArguments(m) + else: + self.objects.append(t.extract_all_objects()) + class SharedLibrary(BuildTarget): known_kwargs = known_shlib_kwargs @@ -2713,6 +2669,27 @@ class SharedLibrary(BuildTarget): result.link_targets = [t.get(lib_type, True) for t in self.link_targets] return result + def link_whole( + self, + targets: T.List[StaticTargetTypes], + promoted: bool = False) -> None: + for t in targets: + self.check_can_link_together(t) + if not getattr(t, 'pic', True): + msg = f"Can't link non-PIC static library {t.name!r} into shared library {self.name!r}. " + msg += "Use the 'pic' option to static_library to build with PIC." + raise InvalidArguments(msg) + self.link_whole_targets.append(t) + + def link(self, targets: T.List[BuildTargetTypes]) -> None: + for t in targets: + if isinstance(t, StaticLibrary) and not t.pic: + msg = f"Can't link non-PIC static library {t.name!r} into shared library {self.name!r}. " + msg += "Use the 'pic' option to static_library to build with PIC." 
+ raise InvalidArguments(msg) + self.check_can_link_together(t) + self.link_targets.append(t) + # A shared library that is meant to be used with dlopen rather than linking # into something else. class SharedModule(SharedLibrary): @@ -2775,6 +2752,10 @@ class BothLibraries(SecondLevelHolder): def get_id(self) -> str: return self.get_default_object().get_id() + def is_linkable_target(self) -> bool: + # For polymorphism with build targets + return True + class CommandBase: depend_files: T.List[File] @@ -2827,10 +2808,10 @@ class CustomTargetBase: def get_dependencies_recurse(self, result: OrderedSet[BuildTargetTypes], include_internals: bool = True) -> None: pass - def get_internal_static_libraries(self) -> OrderedSet[BuildTargetTypes]: + def get_internal_static_libraries(self) -> OrderedSet[StaticTargetTypes]: return OrderedSet() - def get_internal_static_libraries_recurse(self, result: OrderedSet[BuildTargetTypes]) -> None: + def get_internal_static_libraries_recurse(self, result: OrderedSet[StaticTargetTypes]) -> None: pass def get_all_linked_targets(self) -> ImmutableListProtocol[BuildTargetTypes]: diff --git a/mesonbuild/cargo/__init__.py b/mesonbuild/cargo/__init__.py index c5b157f3c..65e018a9d 100644 --- a/mesonbuild/cargo/__init__.py +++ b/mesonbuild/cargo/__init__.py @@ -1,7 +1,9 @@ __all__ = [ 'Interpreter', + 'PackageState', 'TomlImplementationMissing', + 'WorkspaceState', ] -from .interpreter import Interpreter +from .interpreter import Interpreter, PackageState, WorkspaceState from .toml import TomlImplementationMissing diff --git a/mesonbuild/cargo/interpreter.py b/mesonbuild/cargo/interpreter.py index e26298641..5b003c6b4 100644 --- a/mesonbuild/cargo/interpreter.py +++ b/mesonbuild/cargo/interpreter.py @@ -12,6 +12,7 @@ port will be required. from __future__ import annotations import dataclasses import functools +import itertools import os import pathlib import collections @@ -23,38 +24,29 @@ from . 
import builder, version from .cfg import eval_cfg from .toml import load_toml from .manifest import Manifest, CargoLock, CargoLockPackage, Workspace, fixup_meson_varname -from ..mesonlib import is_parent_path, MesonException, MachineChoice, version_compare +from ..mesonlib import ( + is_parent_path, lazy_property, MesonException, MachineChoice, + unique_list, version_compare) from .. import coredata, mlog from ..wrap.wrap import PackageDefinition if T.TYPE_CHECKING: from . import raw from .. import mparser - from .manifest import Dependency, SystemDependency + from typing_extensions import Literal + + from .manifest import Dependency from ..environment import Environment from ..interpreterbase import SubProject from ..compilers.rust import RustCompiler - from typing_extensions import Literal + RUST_ABI = Literal['rust', 'c', 'proc-macro'] def _dependency_name(package_name: str, api: str, suffix: str = '-rs') -> str: basename = package_name[:-len(suffix)] if suffix and package_name.endswith(suffix) else package_name return f'{basename}-{api}{suffix}' -def _dependency_varname(dep: Dependency) -> str: - return f'{fixup_meson_varname(dep.package)}_{(dep.api.replace(".", "_"))}_dep' - - -def _library_name(name: str, api: str, lib_type: Literal['rust', 'c', 'proc-macro'] = 'rust') -> str: - # Add the API version to the library name to avoid conflicts when multiple - # versions of the same crate are used. The Ninja backend removed everything - # after the + to form the crate name. 
- if lib_type == 'c': - return name - return f'{name}+{api.replace(".", "_")}' - - def _extra_args_varname() -> str: return 'extra_args' @@ -86,7 +78,7 @@ class PackageConfiguration: dep = manifest.dependencies[name] dep_key = PackageKey(dep.package, dep.api) dep_pkg = self.dep_packages[dep_key] - dep_lib_name = _library_name(dep_pkg.manifest.lib.name, dep_pkg.manifest.package.api) + dep_lib_name = dep_pkg.library_name() dep_crate_name = name if name != dep.package else dep_pkg.manifest.lib.name dependency_map[dep_lib_name] = dep_crate_name return dependency_map @@ -102,6 +94,21 @@ class PackageState: # Package configuration state cfg: T.Optional[PackageConfiguration] = None + @lazy_property + def path(self) -> T.Optional[str]: + if not self.ws_subdir: + return None + return os.path.normpath(os.path.join(self.ws_subdir, self.ws_member)) + + def library_name(self, lib_type: RUST_ABI = 'rust') -> str: + # Add the API version to the library name to avoid conflicts when multiple + # versions of the same crate are used. The Ninja backend removed everything + # after the + to form the crate name. 
+        name = fixup_meson_varname(self.manifest.package.name)
+        if lib_type == 'c':
+            return name
+        return f'{name}+{self.manifest.package.api.replace(".", "_")}'
+
     def get_env_dict(self, environment: Environment, subdir: str) -> T.Dict[str, str]:
         """Get environment variables for this package."""
         # Common variables for build.rs and crates
@@ -144,6 +151,8 @@ class PackageState:
                 args.extend(lint.to_arguments(has_check_cfg))

         if has_check_cfg:
+            args.append('--check-cfg')
+            args.append('cfg(test)')
             for feature in self.manifest.features:
                 if feature != 'default':
                     args.append('--check-cfg')
@@ -181,6 +190,58 @@ class PackageState:
             args.extend(self.get_env_args(rustc, environment, subdir))
         return args

+    def supported_abis(self) -> T.Set[RUST_ABI]:
+        """Return which ABIs are exposed by the package's crate_types."""
+        crate_types = self.manifest.lib.crate_type
+        abis: T.Set[RUST_ABI] = set()
+        if any(ct in {'lib', 'rlib', 'dylib'} for ct in crate_types):
+            abis.add('rust')
+        if any(ct in {'staticlib', 'cdylib'} for ct in crate_types):
+            abis.add('c')
+        if 'proc-macro' in crate_types:
+            abis.add('proc-macro')
+        return abis
+
+    def get_subproject_name(self) -> str:
+        return _dependency_name(self.manifest.package.name, self.manifest.package.api)
+
+    def abi_resolve_default(self, rust_abi: T.Optional[RUST_ABI]) -> RUST_ABI:
+        supported_abis = self.supported_abis()
+        if rust_abi is None:
+            if len(supported_abis) > 1:
+                raise MesonException(f'Package {self.manifest.package.name} supports more than one ABI')
+            return next(iter(supported_abis))
+        else:
+            if rust_abi not in supported_abis:
+                raise MesonException(f'Package {self.manifest.package.name} does not support ABI {rust_abi}')
+            return rust_abi
+
+    def abi_has_shared(self, rust_abi: RUST_ABI) -> bool:
+        if rust_abi == 'proc-macro':
+            return True
+        return ('cdylib' if rust_abi == 'c' else 'dylib') in self.manifest.lib.crate_type
+
+    def abi_has_static(self, rust_abi: RUST_ABI) -> bool:
+        if rust_abi == 'proc-macro':
+            return False
+
crate_type = self.manifest.lib.crate_type + if rust_abi == 'c': + return 'staticlib' in crate_type + return 'lib' in crate_type or 'rlib' in crate_type + + def get_dependency_name(self, rust_abi: T.Optional[RUST_ABI]) -> str: + """Get the dependency name for a package with the given ABI.""" + rust_abi = self.abi_resolve_default(rust_abi) + package_name = self.manifest.package.name + api = self.manifest.package.api + + if rust_abi in {'rust', 'proc-macro'}: + return _dependency_name(package_name, api) + elif rust_abi == 'c': + return _dependency_name(package_name, api, '') + else: + raise MesonException(f'Unknown rust_abi: {rust_abi}') + @dataclasses.dataclass(frozen=True) class PackageKey: @@ -202,6 +263,8 @@ class WorkspaceState: class Interpreter: + _features: T.Optional[T.List[str]] = None + def __init__(self, env: Environment, subdir: str, subprojects_dir: str) -> None: self.environment = env self.subprojects_dir = subprojects_dir @@ -221,28 +284,66 @@ class Interpreter: self.environment.wrap_resolver.merge_wraps(self.cargolock.wraps) self.build_def_files.append(filename) + @property + def features(self) -> T.List[str]: + """Get the features list. Once read, it cannot be modified.""" + if self._features is None: + self._features = ['default'] + return self._features + + @features.setter + def features(self, value: T.List[str]) -> None: + """Set the features list. 
Can only be set before first read.""" + value_unique = sorted(unique_list(value)) + if self._features is not None and value_unique != self._features: + raise MesonException("Cannot modify features after they have been selected or used") + self._features = value_unique + def get_build_def_files(self) -> T.List[str]: return self.build_def_files + def load_workspace(self, subdir: str) -> WorkspaceState: + """Load the root Cargo.toml package and prepare it with features and dependencies.""" + subdir = os.path.normpath(subdir) + manifest, cached = self._load_manifest(subdir) + ws = self._get_workspace(manifest, subdir, False) + if not cached: + self._prepare_entry_point(ws) + return ws + def _prepare_entry_point(self, ws: WorkspaceState) -> None: pkgs = [self._require_workspace_member(ws, m) for m in ws.workspace.default_members] for pkg in pkgs: self._prepare_package(pkg) - self._enable_feature(pkg, 'default') + for feature in self.features: + self._enable_feature(pkg, feature) + + def load_package(self, ws: WorkspaceState, package_name: T.Optional[str]) -> PackageState: + if package_name is None: + if not ws.workspace.root_package: + raise MesonException('no root package in workspace') + path = '.' 
+ else: + try: + path = ws.packages_to_member[package_name] + except KeyError: + raise MesonException(f'workspace member "{package_name}" not found') + + if is_parent_path(self.subprojects_dir, path): + raise MesonException('argument to package() cannot be a subproject') + return ws.packages[path] def interpret(self, subdir: str, project_root: T.Optional[str] = None) -> mparser.CodeBlockNode: - manifest, cached = self._load_manifest(subdir) filename = os.path.join(self.environment.source_dir, subdir, 'Cargo.toml') build = builder.Builder(filename) if project_root: # this is a subdir() + manifest, _ = self._load_manifest(subdir) assert isinstance(manifest, Manifest) return self.interpret_package(manifest, build, subdir, project_root) - - ws = self._get_workspace(manifest, subdir, downloaded=False) - if not cached: - self._prepare_entry_point(ws) - return self.interpret_workspace(ws, build, subdir) + else: + ws = self.load_workspace(subdir) + return self.interpret_workspace(ws, build, subdir) def interpret_package(self, manifest: Manifest, build: builder.Builder, subdir: str, project_root: str) -> mparser.CodeBlockNode: # Build an AST for this package @@ -253,9 +354,10 @@ class Interpreter: return build.block(ast) def _create_package(self, pkg: PackageState, build: builder.Builder, subdir: str) -> T.List[mparser.BaseNode]: - cfg = pkg.cfg ast: T.List[mparser.BaseNode] = [ - build.assign(build.array([build.string(f) for f in cfg.features]), 'features'), + build.assign(build.method('package', build.identifier('cargo_ws'), + [build.string(pkg.manifest.package.name)]), 'pkg_obj'), + build.assign(build.method('features', build.identifier('pkg_obj')), 'features'), build.function('message', [ build.string('Enabled features:'), build.identifier('features'), @@ -268,16 +370,8 @@ class Interpreter: crate_type = pkg.manifest.lib.crate_type if 'dylib' in crate_type and 'cdylib' in crate_type: raise MesonException('Cannot build both dylib and cdylib due to file name conflict') - 
if 'proc-macro' in crate_type: - ast.extend(self._create_lib(pkg, build, subdir, 'proc-macro', shared=True)) - if any(x in crate_type for x in ['lib', 'rlib', 'dylib']): - ast.extend(self._create_lib(pkg, build, subdir, 'rust', - static=('lib' in crate_type or 'rlib' in crate_type), - shared='dylib' in crate_type)) - if any(x in crate_type for x in ['staticlib', 'cdylib']): - ast.extend(self._create_lib(pkg, build, subdir, 'c', - static='staticlib' in crate_type, - shared='cdylib' in crate_type)) + for abi in pkg.supported_abis(): + ast.extend(self._create_lib(pkg, build, subdir, abi)) return ast @@ -311,7 +405,6 @@ class Interpreter: ast.append(build.function('subdir', [build.string(member)])) processed_members[member] = pkg - ast.append(build.assign(build.function('import', [build.string('rust')]), 'rust')) for member in ws.required_members: _process_member(member) ast = self._create_project(name, processed_members.get('.'), build) + ast @@ -386,6 +479,13 @@ class Interpreter: raise MesonException(f'Cannot determine version of cargo package {package_name}') return None + def resolve_package(self, package_name: str, api: str) -> T.Optional[PackageState]: + cargo_pkg = self._resolve_package(package_name, version.convert(api)) + if not cargo_pkg: + return None + api = version.api(cargo_pkg.version) + return self._fetch_package(package_name, api) + def _fetch_package_from_subproject(self, package_name: str, meson_depname: str) -> PackageState: subp_name, _ = self.environment.wrap_resolver.find_dep_provider(meson_depname) if subp_name is None: @@ -426,6 +526,20 @@ class Interpreter: for condition, dependencies in pkg.manifest.target.items(): if eval_cfg(condition, cfgs): pkg.manifest.dependencies.update(dependencies) + + # If you specify the optional dependency with the dep: prefix anywhere in the [features] + # table, that disables the implicit feature. 
+ deps = set(feature[4:] + for feature in itertools.chain.from_iterable(pkg.manifest.features.values()) + if feature.startswith('dep:')) + for name, dep in itertools.chain(pkg.manifest.dependencies.items(), + pkg.manifest.dev_dependencies.items(), + pkg.manifest.build_dependencies.items()): + if dep.optional and name not in deps: + pkg.manifest.features.setdefault(name, []) + pkg.manifest.features[name].append(f'dep:{name}') + deps.add(name) + # Fetch required dependencies recursively. for depname, dep in pkg.manifest.dependencies.items(): if not dep.optional: @@ -463,8 +577,12 @@ class Interpreter: return manifest_, True path = os.path.join(self.environment.source_dir, subdir) filename = os.path.join(path, 'Cargo.toml') + try: + raw_manifest = T.cast('raw.Manifest', load_toml(filename)) + except OSError as e: + raise MesonException(f'could not load {subdir}/Cargo.toml: {e}') + self.build_def_files.append(filename) - raw_manifest = T.cast('raw.Manifest', load_toml(filename)) if 'workspace' in raw_manifest: manifest_ = Workspace.from_raw(raw_manifest, path) elif 'package' in raw_manifest: @@ -497,9 +615,6 @@ class Interpreter: if feature in cfg.features: return cfg.features.add(feature) - # A feature can also be a dependency. - if feature in pkg.manifest.dependencies: - self._add_dependency(pkg, feature) # Recurse on extra features and dependencies this feature pulls. 
# https://doc.rust-lang.org/cargo/reference/features.html#the-features-section for f in pkg.manifest.features.get(feature, []): @@ -566,25 +681,30 @@ class Interpreter: # for the upkeep of the module 'meson_version': build.string(f'>= {coredata.stable_version}'), } - if not pkg: - return [ - build.function('project', args, kwargs), - ] - - default_options: T.Dict[str, mparser.BaseNode] = {} - if pkg.downloaded: - default_options['warning_level'] = build.string('0') - - kwargs.update({ - 'version': build.string(pkg.manifest.package.version), - 'default_options': build.dict({build.string(k): v for k, v in default_options.items()}), - }) - if pkg.manifest.package.license: - kwargs['license'] = build.string(pkg.manifest.package.license) - elif pkg.manifest.package.license_file: - kwargs['license_files'] = build.string(pkg.manifest.package.license_file) - - return [build.function('project', args, kwargs)] + if pkg: + default_options: T.Dict[str, mparser.BaseNode] = {} + if pkg.downloaded: + default_options['warning_level'] = build.string('0') + + kwargs.update({ + 'version': build.string(pkg.manifest.package.version), + 'default_options': build.dict({build.string(k): v for k, v in default_options.items()}), + }) + if pkg.manifest.package.license: + kwargs['license'] = build.string(pkg.manifest.package.license) + elif pkg.manifest.package.license_file: + kwargs['license_files'] = build.string(pkg.manifest.package.license_file) + + # project(...) 
+            # rust = import('rust')
+            # cargo_ws = rust.workspace()
+        return [
+            build.function('project', args, kwargs),
+            build.assign(build.function('import', [build.string('rust')]),
+                         'rust'),
+            build.assign(build.method('workspace', build.identifier('rust'), []),
+                         'cargo_ws')
+        ]

     def _create_dependencies(self, pkg: PackageState, build: builder.Builder) -> T.List[mparser.BaseNode]:
         cfg = pkg.cfg
@@ -594,47 +714,40 @@ class Interpreter:
             dep_pkg = self._dep_package(pkg, dep)
             if dep_pkg.manifest.lib:
                 ast += self._create_dependency(dep_pkg, dep, build)
-        ast.append(build.assign(build.array([]), 'system_deps_args'))
-        for name, sys_dep in pkg.manifest.system_dependencies.items():
-            if sys_dep.enabled(cfg.features):
-                ast += self._create_system_dependency(name, sys_dep, build)
         return ast

-    def _create_system_dependency(self, name: str, dep: SystemDependency, build: builder.Builder) -> T.List[mparser.BaseNode]:
-        # TODO: handle feature_overrides
-        kw = {
-            'version': build.array([build.string(s) for s in dep.meson_version]),
-            'required': build.bool(not dep.optional),
-        }
-        varname = f'{fixup_meson_varname(name)}_system_dep'
-        cfg = f'system_deps_have_{fixup_meson_varname(name)}'
-        return [
-            build.assign(
-                build.function(
-                    'dependency',
-                    [build.string(dep.name)],
-                    kw,
-                ),
-                varname,
-            ),
-            build.if_(
-                build.method('found', build.identifier(varname)), build.block([
-                    build.plusassign(
-                        build.array([build.string('--cfg'), build.string(cfg)]),
-                        'system_deps_args'
-                    ),
-                ])
-            ),
-        ]
-
     def _create_dependency(self, pkg: PackageState, dep: Dependency, build: builder.Builder) -> T.List[mparser.BaseNode]:
         cfg = pkg.cfg
-        version_ = dep.meson_version or [pkg.manifest.package.version]
-        kw = {
-            'version': build.array([build.string(s) for s in version_]),
-        }
-        # Lookup for this dependency with the features we want in default_options kwarg.
- # + feat_obj: mparser.BaseNode + if self.cargolock and self.resolve_package(dep.package, dep.api): + # actual_features = cargo_ws.subproject(...).features() + feat_obj = build.method( + 'features', + build.method( + 'subproject', + build.identifier('cargo_ws'), + [build.string(dep.package), build.string(dep.api)])) + else: + version_ = dep.meson_version or [pkg.manifest.package.version] + kw = { + 'version': build.array([build.string(s) for s in version_]), + } + # actual_features = dependency(...).get_variable('features', default_value : '').split(',') + dep_obj = build.function( + 'dependency', + [build.string(_dependency_name(dep.package, dep.api))], + kw) + feat_obj = build.method( + 'split', + build.method( + 'get_variable', + dep_obj, + [build.string('features')], + {'default_value': build.string('')} + ), + [build.string(',')], + ) + # However, this subproject could have been previously configured with a # different set of features. Cargo collects the set of features globally # but Meson can only use features enabled by the first call that triggered @@ -645,27 +758,9 @@ class Interpreter: # option manually with -Dxxx-rs:feature-yyy=true, or the main project can do # that in its project(..., default_options: ['xxx-rs:feature-yyy=true']). return [ - # xxx_dep = dependency('xxx', version : ...) + # actual_features = dependency(...).get_variable('features', default_value : '').split(',') build.assign( - build.function( - 'dependency', - [build.string(_dependency_name(dep.package, dep.api))], - kw, - ), - _dependency_varname(dep), - ), - # actual_features = xxx_dep.get_variable('features', default_value : '').split(',') - build.assign( - build.method( - 'split', - build.method( - 'get_variable', - build.identifier(_dependency_varname(dep)), - [build.string('features')], - {'default_value': build.string('')} - ), - [build.string(',')], - ), + feat_obj, 'actual_features' ), # needed_features = [f1, f2, ...] 
@@ -708,60 +803,21 @@ class Interpreter: ] def _create_lib(self, pkg: PackageState, build: builder.Builder, subdir: str, - lib_type: Literal['rust', 'c', 'proc-macro'], - static: bool = False, shared: bool = False) -> T.List[mparser.BaseNode]: - cfg = pkg.cfg - dependencies: T.List[mparser.BaseNode] = [] - for name in cfg.required_deps: - dep = pkg.manifest.dependencies[name] - dependencies.append(build.identifier(_dependency_varname(dep))) - - dependency_map: T.Dict[mparser.BaseNode, mparser.BaseNode] = { - build.string(k): build.string(v) for k, v in cfg.get_dependency_map(pkg.manifest).items()} - - for name, sys_dep in pkg.manifest.system_dependencies.items(): - if sys_dep.enabled(cfg.features): - dependencies.append(build.identifier(f'{fixup_meson_varname(name)}_system_dep')) - - rustc_args_list = pkg.get_rustc_args(self.environment, subdir, MachineChoice.HOST) - extra_args_ref = build.identifier(_extra_args_varname()) - system_deps_args_ref = build.identifier('system_deps_args') - rust_args: T.List[mparser.BaseNode] = [build.string(a) for a in rustc_args_list] - rust_args.append(extra_args_ref) - rust_args.append(system_deps_args_ref) - - dependencies.append(build.identifier(_extra_deps_varname())) - - override_options: T.Dict[mparser.BaseNode, mparser.BaseNode] = { - build.string('rust_std'): build.string(pkg.manifest.package.edition), - } - + lib_type: RUST_ABI) -> T.List[mparser.BaseNode]: posargs: T.List[mparser.BaseNode] = [ - build.string(_library_name(pkg.manifest.lib.name, pkg.manifest.package.api, lib_type)), - build.string(pkg.manifest.lib.path), + build.string(pkg.library_name(lib_type)), ] kwargs: T.Dict[str, mparser.BaseNode] = { - 'dependencies': build.array(dependencies), - 'rust_dependency_map': build.dict(dependency_map), - 'rust_args': build.array(rust_args), - 'override_options': build.dict(override_options), + 'dependencies': build.identifier(_extra_deps_varname()), + 'rust_args': build.identifier(_extra_args_varname()), } - depname_suffix 
= '' if lib_type == 'c' else '-rs' - depname = _dependency_name(pkg.manifest.package.name, pkg.manifest.package.api, depname_suffix) - - lib: mparser.BaseNode if lib_type == 'proc-macro': - lib = build.method('proc_macro', build.identifier('rust'), posargs, kwargs) + lib = build.method('proc_macro', build.identifier('pkg_obj'), posargs, kwargs) else: - if static and shared: - target_type = 'both_libraries' - else: - target_type = 'shared_library' if shared else 'static_library' - kwargs['rust_abi'] = build.string(lib_type) - lib = build.function(target_type, posargs, kwargs) + lib = build.method('library', build.identifier('pkg_obj'), posargs, kwargs) # lib = xxx_library() # dep = declare_dependency() @@ -774,20 +830,19 @@ class Interpreter: kw={ 'link_with': build.identifier('lib'), 'variables': build.dict({ - build.string('features'): build.string(','.join(cfg.features)), + build.string('features'): build.method('join', build.string(','), + [build.identifier('features')]), }), - 'version': build.string(pkg.manifest.package.version), + 'version': build.method('version', build.identifier('pkg_obj')), }, ), 'dep' ), build.method( 'override_dependency', - build.identifier('meson'), - [ - build.string(depname), - build.identifier('dep'), - ], + build.identifier('pkg_obj'), + [build.identifier('dep')], + {'rust_abi': build.string(lib_type)} ), ] @@ -825,7 +880,7 @@ def load_cargo_lock(filename: str, subproject_dir: str) -> T.Optional[CargoLock] meson_depname = _dependency_name(package.name, version.api(package.version)) if package.source is None: # This is project's package, or one of its workspace members. 
- pass + continue elif package.source == 'registry+https://github.com/rust-lang/crates.io-index': checksum = package.checksum if checksum is None: diff --git a/mesonbuild/cargo/manifest.py b/mesonbuild/cargo/manifest.py index ec84e4b16..125319972 100644 --- a/mesonbuild/cargo/manifest.py +++ b/mesonbuild/cargo/manifest.py @@ -79,6 +79,28 @@ class ConvertValue(DefaultValue): return self.func(v if v is not None else ws_v) +class DictMergeValue(ConvertValue): + """Merge the incoming array of tables with a dictionary; + a user-provided function maps each table to one of the + entries of the dictionary.""" + + def __init__(self, func: T.Callable[[T.Any], T.List[object]], key: T.Callable[[T.Any], str], base: T.Optional[T.Mapping[str, object]] = None) -> None: + super().__init__(func, base) + self.key = key + + def convert(self, v: T.Any, ws_v: T.Any) -> object: + out = self.func(v if v is not None else ws_v) + assert isinstance(out, list) # for mypy + assert isinstance(self.default, dict) # for mypy + + d = {self.key(x): x for x in out} + # FIXME: check how auto-discovered items are merged with Cargo.toml + for k, base_v in self.default.items(): + if k not in d: + d[k] = base_v + return d + + def _raw_to_dataclass(raw: T.Mapping[str, object], cls: T.Type[_DI], msg: str, raw_from_workspace: T.Optional[T.Mapping[str, object]] = None, ignored_fields: T.Optional[T.List[str]] = None, @@ -442,8 +464,7 @@ class Lint: settings = T.cast('raw.Lint', {'level': settings}) check_cfg = None if name == 'unexpected_cfgs': - # 'cfg(test)' is added automatically by cargo - check_cfg = ['cfg(test)'] + settings.get('check-cfg', []) + check_cfg = settings.get('check-cfg', []) lints[name] = Lint(name=name, level=settings['level'], priority=settings.get('priority', 0), @@ -491,7 +512,7 @@ class Manifest: dev_dependencies: T.Dict[str, Dependency] = dataclasses.field(default_factory=dict) build_dependencies: T.Dict[str, Dependency] = dataclasses.field(default_factory=dict) lib: T.Optional[Library] = None - bin: 
T.List[Binary] = dataclasses.field(default_factory=list) + bin: T.Dict[str, Binary] = dataclasses.field(default_factory=dict) test: T.List[Test] = dataclasses.field(default_factory=list) bench: T.List[Benchmark] = dataclasses.field(default_factory=list) example: T.List[Example] = dataclasses.field(default_factory=list) @@ -516,6 +537,24 @@ class Manifest: if pkg.autolib and os.path.exists(os.path.join(path, 'src/lib.rs')): autolib = Library.from_raw({}, pkg) + def _discover_targets(subdir: str) -> T.Generator[T.Tuple[str, str], None, None]: + """Discover .rs files in a subdirectory and yield (name, path) tuples.""" + target_dir = os.path.join(path, subdir) + if os.path.isdir(target_dir): + for entry in os.listdir(target_dir): + if entry.endswith('.rs'): + target_name = entry[:-3] # Remove .rs extension + yield target_name, f'{subdir}/{entry}' + + autobins: T.Dict[str, Binary] = {} + if pkg.autobins: + # Check for default binary (src/main.rs) + if os.path.exists(os.path.join(path, 'src/main.rs')): + autobins[pkg.name] = Binary.from_raw({'name': pkg.name, 'path': 'src/main.rs'}, pkg) + # Add additional binaries from src/bin/ + for bin_name, bin_path in _discover_targets('src/bin'): + autobins[bin_name] = Binary.from_raw({'name': bin_name, 'path': bin_path}, pkg) + def dependencies_from_raw(x: T.Dict[str, T.Any]) -> T.Dict[str, Dependency]: return {k: Dependency.from_raw(k, v, member_path, workspace) for k, v in x.items()} @@ -528,7 +575,9 @@ class Manifest: build_dependencies=ConvertValue(dependencies_from_raw), lints=ConvertValue(Lint.from_raw), lib=ConvertValue(lambda x: Library.from_raw(x, pkg), default=autolib), - bin=ConvertValue(lambda x: 
[Binary.from_raw(b, pkg) for b in x]), + bin=DictMergeValue(lambda x: [Binary.from_raw(b, pkg) for b in x], + lambda x: x.name, + base=autobins), test=ConvertValue(lambda x: [Test.from_raw(b, pkg) for b in x]), bench=ConvertValue(lambda x: [Benchmark.from_raw(b, pkg) for b in x]), example=ConvertValue(lambda x: [Example.from_raw(b, pkg) for b in x]), diff --git a/mesonbuild/cargo/toml.py b/mesonbuild/cargo/toml.py index 601510e4e..dda7cfda8 100644 --- a/mesonbuild/cargo/toml.py +++ b/mesonbuild/cargo/toml.py @@ -31,17 +31,28 @@ class TomlImplementationMissing(MesonException): pass +class CargoTomlError(MesonException): + """Exception for TOML parsing errors, keeping proper location info.""" + + def load_toml(filename: str) -> T.Dict[str, object]: if tomllib: - with open(filename, 'rb') as f: - raw = tomllib.load(f) + try: + with open(filename, 'rb') as f: + raw = tomllib.load(f) + except tomllib.TOMLDecodeError as e: + if hasattr(e, 'msg'): + raise CargoTomlError(e.msg, file=filename, lineno=e.lineno, colno=e.colno) from e + else: + raise CargoTomlError(str(e), file=filename) from e else: if toml2json is None: raise TomlImplementationMissing('Could not find an implementation of tomllib, nor toml2json') p, out, err = Popen_safe([toml2json, filename]) if p.returncode != 0: - raise MesonException('toml2json failed to decode output\n', err) + error_msg = err.strip() or 'toml2json failed to decode TOML' + raise CargoTomlError(error_msg, file=filename) raw = json.loads(out) diff --git a/mesonbuild/compilers/detect.py b/mesonbuild/compilers/detect.py index db2bdf6ab..eb42bb67e 100644 --- a/mesonbuild/compilers/detect.py +++ b/mesonbuild/compilers/detect.py @@ -701,7 +701,7 @@ def detect_cuda_compiler(env: 'Environment', for_machine: MachineChoice) -> Comp cls = CudaCompiler env.add_lang_args(cls.language, cls, for_machine) key = OptionKey('cuda_link_args', machine=for_machine) - if env.is_cross_build(for_machine): + if not env.is_cross_build(for_machine): key = 
key.as_host() if key in env.options: # To fix LDFLAGS issue diff --git a/mesonbuild/compilers/rust.py b/mesonbuild/compilers/rust.py index ab0706d26..7ad7f34e9 100644 --- a/mesonbuild/compilers/rust.py +++ b/mesonbuild/compilers/rust.py @@ -11,6 +11,7 @@ import re import typing as T from .. import options +from ..dependencies import InternalDependency from ..mesonlib import EnvironmentException, MesonException, Popen_safe_logged, version_compare from ..linkers.linkers import VisualStudioLikeLinkerMixin from ..options import OptionKey @@ -73,6 +74,11 @@ def rustc_link_args(args: T.List[str]) -> T.List[str]: rustc_args.append(f'link-arg={arg}') return rustc_args + +class RustSystemDependency(InternalDependency): + pass + + class RustCompiler(Compiler): # rustc doesn't invoke the compiler itself, it doesn't need a LINKER_PREFIX @@ -323,6 +329,8 @@ class RustCompiler(Compiler): return opts def get_dependency_compile_args(self, dep: 'Dependency') -> T.List[str]: + if isinstance(dep, RustSystemDependency): + return dep.get_compile_args() # Rust doesn't have dependency compile arguments so simply return # nothing here. Dependencies are linked and all required metadata is # provided by the linker flags. diff --git a/mesonbuild/coredata.py b/mesonbuild/coredata.py index 8d7a1c557..5d689e9d0 100644 --- a/mesonbuild/coredata.py +++ b/mesonbuild/coredata.py @@ -351,6 +351,8 @@ class CoreData: # key and target have the same subproject for consistency. # Now just do this to get things going. 
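The cross-build branch added to `coredata.py` above relies on `OptionKey.evolve` handing back a modified copy of the key rather than mutating a shared instance. A reduced sketch of that copy-on-change pattern (this `OptionKey` model is illustrative, not Meson's full class):

```python
import dataclasses
from enum import Enum

class MachineChoice(Enum):
    BUILD = 0
    HOST = 1

@dataclasses.dataclass(frozen=True)
class OptionKey:
    # Hypothetical reduced model: frozen, so every variation produces
    # a new key instead of mutating one shared between lookups.
    name: str
    subproject: str = ''
    machine: MachineChoice = MachineChoice.HOST

    def evolve(self, **kwargs: object) -> 'OptionKey':
        # dataclasses.replace returns a copy with the given fields changed
        return dataclasses.replace(self, **kwargs)

key = OptionKey('cpp_std')
# During a cross build, re-target the lookup at the machine the target is
# built for, as the hunk does with target.for_machine.
build_key = key.evolve(machine=MachineChoice.BUILD)
assert key.machine is MachineChoice.HOST
assert build_key.machine is MachineChoice.BUILD
```

Because the original key is untouched, the same lookup can later be repeated against the host machine without undoing anything.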
newkey = newkey.evolve(subproject=target.subproject) + if self.is_cross_build(): + newkey = newkey.evolve(machine=target.for_machine) option_object, value = self.optstore.get_option_and_value_for(newkey) override = target.get_override(newkey.name) if override is not None: diff --git a/mesonbuild/dependencies/base.py b/mesonbuild/dependencies/base.py index 4536746d9..9df8b5450 100644 --- a/mesonbuild/dependencies/base.py +++ b/mesonbuild/dependencies/base.py @@ -7,6 +7,7 @@ from __future__ import annotations import copy +import dataclasses import os import collections import itertools @@ -21,7 +22,7 @@ from ..options import OptionKey #from ..interpreterbase import FeatureDeprecated, FeatureNew if T.TYPE_CHECKING: - from typing_extensions import Literal, TypedDict, TypeAlias + from typing_extensions import Literal, Required, TypedDict, TypeAlias from ..compilers.compilers import Compiler from ..environment import Environment @@ -51,7 +52,7 @@ if T.TYPE_CHECKING: main: bool method: DependencyMethods modules: T.List[str] - native: MachineChoice + native: Required[MachineChoice] optional_modules: T.List[str] private_headers: bool required: bool @@ -73,6 +74,8 @@ else: _MissingCompilerBase = object +DepType = T.TypeVar('DepType', bound='ExternalDependency', covariant=True) + class DependencyException(MesonException): '''Exceptions raised while trying to find dependencies''' @@ -129,15 +132,16 @@ DependencyTypeName = T.NewType('DependencyTypeName', str) class Dependency(HoldableObject): - def __init__(self, type_name: DependencyTypeName, kwargs: DependencyObjectKWs) -> None: + type_name: DependencyTypeName + + def __init__(self, kwargs: DependencyObjectKWs) -> None: # This allows two Dependencies to be compared even after being copied. 
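The comment in the `Dependency` constructor explains that the uuid-based id lets two dependencies compare equal even after one is copied and renamed. A minimal sketch of that identity scheme (the `Dep` class and its `__eq__`/`__hash__` are illustrative assumptions, not Meson's exact code):

```python
import copy
import uuid

class Dep:
    # Each dependency gets a random integer id at construction; equality
    # and hashing key off that id, so it survives copying and renaming.
    def __init__(self) -> None:
        self._id = uuid.uuid4().int
        self.name = f'dep{self._id}'

    def __eq__(self, other: object) -> bool:
        return isinstance(other, Dep) and self._id == other._id

    def __hash__(self) -> int:
        return hash(self._id)

d = Dep()
d2 = copy.copy(d)        # shallow copy keeps the same _id
d2.name = 'renamed'
assert d == d2           # identity survives the copy and the rename...
assert d != Dep()        # ...but two fresh dependencies stay distinct
```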
# The purpose is to allow the name to be changed, but still have a proper comparison self._id = uuid.uuid4().int self.name = f'dep{self._id}' self.version: T.Optional[str] = None - self.language: T.Optional[str] = None # None means C-like + self.language: T.Optional[str] = kwargs.get('language') # None means C-like self.is_found = False - self.type_name = type_name self.compile_args: T.List[str] = [] self.link_args: T.List[str] = [] # Raw -L and -l arguments without manual library searching @@ -298,29 +302,34 @@ class Dependency(HoldableObject): return self class InternalDependency(Dependency): - def __init__(self, version: str, incdirs: T.List['IncludeDirs'], compile_args: T.List[str], - link_args: T.List[str], - libraries: T.List[LibTypes], - whole_libraries: T.List[T.Union[StaticLibrary, CustomTarget, CustomTargetIndex]], - sources: T.Sequence[T.Union[mesonlib.File, GeneratedTypes, StructuredSources]], - extra_files: T.Sequence[mesonlib.File], - ext_deps: T.List[Dependency], variables: T.Dict[str, str], - d_module_versions: T.List[T.Union[str, int]], d_import_dirs: T.List['IncludeDirs'], - objects: T.List['ExtractedObjects'], + + type_name = DependencyTypeName('internal') + + def __init__(self, version: str, incdirs: T.Optional[T.List['IncludeDirs']] = None, + compile_args: T.Optional[T.List[str]] = None, + link_args: T.Optional[T.List[str]] = None, + libraries: T.Optional[T.List[LibTypes]] = None, + whole_libraries: T.Optional[T.List[T.Union[StaticLibrary, CustomTarget, CustomTargetIndex]]] = None, + sources: T.Optional[T.Sequence[T.Union[mesonlib.File, GeneratedTypes, StructuredSources]]] = None, + extra_files: T.Optional[T.Sequence[mesonlib.File]] = None, + ext_deps: T.Optional[T.List[Dependency]] = None, variables: T.Optional[T.Dict[str, str]] = None, + d_module_versions: T.Optional[T.List[T.Union[str, int]]] = None, + d_import_dirs: T.Optional[T.List['IncludeDirs']] = None, + objects: T.Optional[T.List['ExtractedObjects']] = None, name: T.Optional[str] = 
None): - super().__init__(DependencyTypeName('internal'), {}) + super().__init__({'native': MachineChoice.HOST}) # TODO: does the native key actually matter self.version = version self.is_found = True - self.include_directories = incdirs - self.compile_args = compile_args - self.link_args = link_args - self.libraries = libraries - self.whole_libraries = whole_libraries - self.sources = list(sources) - self.extra_files = list(extra_files) - self.ext_deps = ext_deps - self.variables = variables - self.objects = objects + self.include_directories = incdirs or [] + self.compile_args = compile_args or [] + self.link_args = link_args or [] + self.libraries = libraries or [] + self.whole_libraries = whole_libraries or [] + self.sources = list(sources or []) + self.extra_files = list(extra_files or []) + self.ext_deps = ext_deps or [] + self.variables = variables or {} + self.objects = objects or [] if d_module_versions: self.d_features['versions'] = d_module_versions if d_import_dirs: @@ -412,12 +421,11 @@ class InternalDependency(Dependency): return new_dep class ExternalDependency(Dependency): - def __init__(self, type_name: DependencyTypeName, environment: 'Environment', kwargs: DependencyObjectKWs, language: T.Optional[str] = None): - Dependency.__init__(self, type_name, kwargs) + def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs): + Dependency.__init__(self, kwargs) self.env = environment - self.name = type_name # default + self.name = name self.is_found = False - self.language = language self.version_reqs = kwargs.get('version', []) self.required = kwargs.get('required', True) self.silent = kwargs.get('silent', False) @@ -427,7 +435,7 @@ class ExternalDependency(Dependency): self.static = static self.libtype = LibType.STATIC if self.static else LibType.PREFER_SHARED # Is this dependency to be run on the build platform? 
- self.for_machine = kwargs.get('native', MachineChoice.HOST) + self.for_machine = kwargs['native'] self.clib_compiler = detect_compiler(self.name, environment, self.for_machine, self.language) def get_compiler(self) -> T.Union['MissingCompiler', 'Compiler']: @@ -457,10 +465,6 @@ class ExternalDependency(Dependency): def log_info(self) -> str: return '' - @staticmethod - def log_tried() -> str: - return '' - # Check if dependency version meets the requirements def _check_version(self) -> None: if not self.is_found: @@ -499,8 +503,11 @@ class ExternalDependency(Dependency): class NotFoundDependency(Dependency): + + type_name = DependencyTypeName('not-found') + def __init__(self, name: str, environment: 'Environment') -> None: - super().__init__(DependencyTypeName('not-found'), {}) + super().__init__({'native': MachineChoice.HOST}) # TODO: does this actually matter? self.env = environment self.name = name self.is_found = False @@ -514,11 +521,12 @@ class NotFoundDependency(Dependency): class ExternalLibrary(ExternalDependency): + + type_name = DependencyTypeName('library') + def __init__(self, name: str, link_args: T.List[str], environment: 'Environment', - language: str, silent: bool = False) -> None: - super().__init__(DependencyTypeName('library'), environment, {}, language=language) - self.name = name - self.language = language + language: str, for_machine: MachineChoice, silent: bool = False) -> None: + super().__init__(name, environment, {'language': language, 'native': for_machine}) self.is_found = False if link_args: self.is_found = True @@ -660,25 +668,44 @@ class SystemDependency(ExternalDependency): """Dependency base for System type dependencies.""" - def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs, - language: T.Optional[str] = None) -> None: - super().__init__(DependencyTypeName('system'), env, kwargs, language=language) - self.name = name - - @staticmethod - def log_tried() -> str: - return 'system' + type_name = 
DependencyTypeName('system') class BuiltinDependency(ExternalDependency): """Dependency base for Builtin type dependencies.""" - def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs, - language: T.Optional[str] = None) -> None: - super().__init__(DependencyTypeName('builtin'), env, kwargs, language=language) - self.name = name + type_name = DependencyTypeName('builtin') + + +@dataclasses.dataclass +class DependencyCandidate(T.Generic[DepType]): + + callable: T.Union[T.Type[DepType], T.Callable[[str, Environment, DependencyObjectKWs], DepType]] + name: str + method: str + modules: T.Optional[T.List[str]] = None + arguments: T.Optional[T.Tuple[Environment, DependencyObjectKWs]] = dataclasses.field(default=None) + + def __call__(self) -> DepType: + if self.arguments is None: + raise mesonlib.MesonBugException('Attempted to instantiate a candidate before setting its arguments') + env, kwargs = self.arguments + if self.modules is not None: + kwargs['modules'] = self.modules.copy() + return self.callable(self.name, env, kwargs) + + @classmethod + def from_dependency(cls, name: str, dep: T.Type[DepType], + args: T.Optional[T.Tuple[Environment, DependencyObjectKWs]] = None, + modules: T.Optional[T.List[str]] = None, + ) -> DependencyCandidate[DepType]: + tried = str(dep.type_name) + + # fixup the cases where type_name and log tried don't match + if tried in {'extraframeworks', 'appleframeworks'}: + tried = 'framework' + elif tried == 'pkgconfig': + tried = 'pkg-config' - @staticmethod - def log_tried() -> str: - return 'builtin' + return cls(dep, name, tried, modules, arguments=args) diff --git a/mesonbuild/dependencies/boost.py b/mesonbuild/dependencies/boost.py index 1aeb451f1..02c449cb2 100644 --- a/mesonbuild/dependencies/boost.py +++ b/mesonbuild/dependencies/boost.py @@ -340,8 +340,9 @@ class BoostLibraryFile(): return [self.path.as_posix()] class BoostDependency(SystemDependency): - def __init__(self, environment: Environment, kwargs: 
DependencyObjectKWs) -> None: - super().__init__('boost', environment, kwargs, language='cpp') + def __init__(self, name: str, environment: Environment, kwargs: DependencyObjectKWs) -> None: + kwargs['language'] = 'cpp' + super().__init__(name, environment, kwargs) buildtype = environment.coredata.optstore.get_value_for(OptionKey('buildtype')) assert isinstance(buildtype, str) self.debug = buildtype.startswith('debug') @@ -361,7 +362,7 @@ class BoostDependency(SystemDependency): # Do we need threads? if 'thread' in self.modules: - if not self._add_sub_dependency(threads_factory(environment, self.for_machine, {})): + if not self._add_sub_dependency(threads_factory(environment, {'native': self.for_machine})): self.is_found = False return @@ -676,7 +677,7 @@ class BoostDependency(SystemDependency): # Try getting the BOOST_ROOT from a boost.pc if it exists. This primarily # allows BoostDependency to find boost from Conan. See #5438 try: - boost_pc = PkgConfigDependency('boost', self.env, {'required': False}) + boost_pc = PkgConfigDependency('boost', self.env, {'required': False, 'native': self.for_machine}) if boost_pc.found(): boost_lib_dir = boost_pc.get_variable(pkgconfig='libdir') boost_inc_dir = boost_pc.get_variable(pkgconfig='includedir') diff --git a/mesonbuild/dependencies/cmake.py b/mesonbuild/dependencies/cmake.py index 066bb2900..5fc207317 100644 --- a/mesonbuild/dependencies/cmake.py +++ b/mesonbuild/dependencies/cmake.py @@ -4,7 +4,7 @@ from __future__ import annotations from .base import ExternalDependency, DependencyException, DependencyTypeName -from ..mesonlib import is_windows, MesonException, PerMachine, MachineChoice +from ..mesonlib import is_windows, MesonException, PerMachine from ..cmake import CMakeExecutor, CMakeTraceParser, CMakeException, CMakeToolchain, CMakeExecScope, check_cmake_args, resolve_cmake_trace_targets, cmake_is_debug from .. 
import mlog import importlib.resources @@ -39,6 +39,8 @@ class CMakeDependency(ExternalDependency): class_cmake_generators = ['', 'Ninja', 'Unix Makefiles', 'Visual Studio 10 2010'] class_working_generator: T.Optional[str] = None + type_name = DependencyTypeName('cmake') + def _gen_exception(self, msg: str) -> DependencyException: return DependencyException(f'Dependency {self.name} not found: {msg}') @@ -70,11 +72,12 @@ class CMakeDependency(ExternalDependency): # one module return module - def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs, language: T.Optional[str] = None, force_use_global_compilers: bool = False) -> None: + def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs, force_use_global_compilers: bool = False) -> None: # Gather a list of all languages to support self.language_list: T.List[str] = [] + language = kwargs.get('language') if language is None or force_use_global_compilers: - for_machine = kwargs.get('native', MachineChoice.HOST) + for_machine = kwargs['native'] compilers = environment.coredata.compilers[for_machine] candidates = ['c', 'cpp', 'fortran', 'objc', 'objcxx'] self.language_list += [x for x in candidates if x in compilers] @@ -88,8 +91,7 @@ class CMakeDependency(ExternalDependency): # Ensure that the list is unique self.language_list = list(set(self.language_list)) - super().__init__(DependencyTypeName('cmake'), environment, kwargs, language=language) - self.name = name + super().__init__(name, environment, kwargs) self.is_libtool = False # Where all CMake "build dirs" are located @@ -607,10 +609,6 @@ class CMakeDependency(ExternalDependency): build_dir = self._setup_cmake_dir(cmake_file) return self.cmakebin.call(args, build_dir, env=env) - @staticmethod - def log_tried() -> str: - return 'cmake' - def log_details(self) -> str: modules = [self._original_module_name(x) for x in self.found_modules] modules = sorted(set(modules)) @@ -643,22 +641,6 @@ class 
CMakeDependency(ExternalDependency): raise DependencyException(f'Could not get cmake variable and no default provided for {self!r}') -class CMakeDependencyFactory: - - def __init__(self, name: T.Optional[str] = None, modules: T.Optional[T.List[str]] = None): - self.name = name - self.modules = modules - - def __call__(self, name: str, env: Environment, kwargs: DependencyObjectKWs, language: T.Optional[str] = None, force_use_global_compilers: bool = False) -> CMakeDependency: - if self.modules: - kwargs['modules'] = self.modules - return CMakeDependency(self.name or name, env, kwargs, language, force_use_global_compilers) - - @staticmethod - def log_tried() -> str: - return CMakeDependency.log_tried() - - def sort_link_args(args: T.List[str]) -> T.List[str]: itr = iter(args) result: T.Set[T.Union[T.Tuple[str], T.Tuple[str, str]]] = set() diff --git a/mesonbuild/dependencies/coarrays.py b/mesonbuild/dependencies/coarrays.py index a4dbdc535..ddb50058f 100644 --- a/mesonbuild/dependencies/coarrays.py +++ b/mesonbuild/dependencies/coarrays.py @@ -3,10 +3,9 @@ from __future__ import annotations -import functools import typing as T -from .base import DependencyMethods, detect_compiler, SystemDependency +from .base import DependencyCandidate, DependencyMethods, detect_compiler, SystemDependency from .cmake import CMakeDependency from .detect import packages from .pkgconfig import PkgConfigDependency @@ -15,15 +14,15 @@ from .factory import factory_methods if T.TYPE_CHECKING: from . 
factory import DependencyGenerator from ..environment import Environment - from ..mesonlib import MachineChoice from .base import DependencyObjectKWs @factory_methods({DependencyMethods.PKGCONFIG, DependencyMethods.CMAKE, DependencyMethods.SYSTEM}) def coarray_factory(env: 'Environment', - for_machine: 'MachineChoice', kwargs: DependencyObjectKWs, methods: T.List[DependencyMethods]) -> T.List['DependencyGenerator']: + kwargs['language'] = 'fortran' + for_machine = kwargs['native'] fcid = detect_compiler('coarray', env, for_machine, 'fortran').get_id() candidates: T.List['DependencyGenerator'] = [] @@ -31,19 +30,24 @@ def coarray_factory(env: 'Environment', # OpenCoarrays is the most commonly used method for Fortran Coarray with GCC if DependencyMethods.PKGCONFIG in methods: for pkg in ['caf-openmpi', 'caf']: - candidates.append(functools.partial( - PkgConfigDependency, pkg, env, kwargs, language='fortran')) + candidates.append(DependencyCandidate.from_dependency( + pkg, PkgConfigDependency, (env, kwargs))) if DependencyMethods.CMAKE in methods: + nkwargs = kwargs if not kwargs.get('modules'): - kwargs['modules'] = ['OpenCoarrays::caf_mpi'] - candidates.append(functools.partial( - CMakeDependency, 'OpenCoarrays', env, kwargs, language='fortran')) + nkwargs = kwargs.copy() + nkwargs['modules'] = ['OpenCoarrays::caf_mpi'] + candidates.append(DependencyCandidate.from_dependency( + 'OpenCoarrays', CMakeDependency, (env, nkwargs))) if DependencyMethods.SYSTEM in methods: - candidates.append(functools.partial(CoarrayDependency, env, kwargs)) + candidates.append(DependencyCandidate.from_dependency( + 'coarray', CoarrayDependency, (env, kwargs))) return candidates + + packages['coarray'] = coarray_factory @@ -56,10 +60,9 @@ class CoarrayDependency(SystemDependency): Coarrays may be thought of as a high-level language abstraction of low-level MPI calls. 
""" - def __init__(self, environment: 'Environment', kwargs: DependencyObjectKWs) -> None: - super().__init__('coarray', environment, kwargs, language='fortran') - kwargs['required'] = False - kwargs['silent'] = True + def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs) -> None: + kwargs['language'] = 'fortran' + super().__init__(name, environment, kwargs) cid = self.get_compiler().get_id() if cid == 'gcc': diff --git a/mesonbuild/dependencies/configtool.py b/mesonbuild/dependencies/configtool.py index e2721fe3b..3d018943c 100644 --- a/mesonbuild/dependencies/configtool.py +++ b/mesonbuild/dependencies/configtool.py @@ -36,10 +36,10 @@ class ConfigToolDependency(ExternalDependency): skip_version: T.Optional[str] = None allow_default_for_cross = False __strip_version = re.compile(r'^[0-9][0-9.]+') + type_name = DependencyTypeName('config-tool') - def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs, language: T.Optional[str] = None, exclude_paths: T.Optional[T.List[str]] = None): - super().__init__(DependencyTypeName('config-tool'), environment, kwargs, language=language) - self.name = name + def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs, exclude_paths: T.Optional[T.List[str]] = None): + super().__init__(name, environment, kwargs) # You may want to overwrite the class version in some cases self.tools = listify(kwargs.get('tools', self.tools)) if not self.tool_name: @@ -150,10 +150,6 @@ class ConfigToolDependency(ExternalDependency): def get_variable_args(self, variable_name: str) -> T.List[str]: return [f'--{variable_name}'] - @staticmethod - def log_tried() -> str: - return 'config-tool' - def get_variable(self, *, cmake: T.Optional[str] = None, pkgconfig: T.Optional[str] = None, configtool: T.Optional[str] = None, internal: T.Optional[str] = None, system: T.Optional[str] = None, default_value: T.Optional[str] = None, diff --git 
a/mesonbuild/dependencies/cuda.py b/mesonbuild/dependencies/cuda.py index d80c62d8d..b3401ee03 100644 --- a/mesonbuild/dependencies/cuda.py +++ b/mesonbuild/dependencies/cuda.py @@ -28,16 +28,17 @@ class CudaDependency(SystemDependency): supported_languages = ['cpp', 'c', 'cuda'] # see also _default_language targets_dir = 'targets' # Directory containing CUDA targets. - def __init__(self, environment: 'Environment', kwargs: DependencyObjectKWs) -> None: - for_machine = kwargs.get('native', mesonlib.MachineChoice.HOST) + def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs) -> None: + for_machine = kwargs['native'] compilers = environment.coredata.compilers[for_machine] machine = environment.machines[for_machine] - language = self._detect_language(compilers) + if not kwargs.get('language'): + kwargs['language'] = self._detect_language(compilers) - if language not in self.supported_languages: - raise DependencyException(f'Language \'{language}\' is not supported by the CUDA Toolkit. Supported languages are {self.supported_languages}.') + if kwargs['language'] not in self.supported_languages: + raise DependencyException(f'Language \'{kwargs["language"]}\' is not supported by the CUDA Toolkit. 
Supported languages are {self.supported_languages}.') - super().__init__('cuda', environment, kwargs, language=language) + super().__init__(name, environment, kwargs) self.lib_modules: T.Dict[str, T.List[str]] = {} self.requested_modules = kwargs.get('modules', []) if not any(runtime in self.requested_modules for runtime in ['cudart', 'cudart_static']): diff --git a/mesonbuild/dependencies/detect.py b/mesonbuild/dependencies/detect.py index f00075b0d..65e19c2aa 100644 --- a/mesonbuild/dependencies/detect.py +++ b/mesonbuild/dependencies/detect.py @@ -3,25 +3,29 @@ from __future__ import annotations -import collections, functools, importlib +import collections, importlib import enum import typing as T -from .base import ExternalDependency, DependencyException, DependencyMethods, NotFoundDependency +from .base import DependencyCandidate, ExternalDependency, DependencyException, DependencyMethods, NotFoundDependency -from ..mesonlib import listify, MachineChoice, PerMachine +from ..mesonlib import listify, PerMachine, MesonBugException, MesonException from .. import mlog if T.TYPE_CHECKING: from ..environment import Environment - from .factory import DependencyFactory, WrappedFactoryFunc, DependencyGenerator + from .factory import DependencyFactory, DependencyGenerator, WrappedFactoryFunc from .base import DependencyObjectKWs TV_DepIDEntry = T.Union[str, bool, int, None, T.Tuple[str, ...]] TV_DepID = T.Tuple[T.Tuple[str, TV_DepIDEntry], ...] 
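The `DependencyCandidate` dataclass introduced earlier in this diff replaces `functools.partial` so that a candidate's `name` and `method` remain inspectable for logging before it is ever constructed, which is what lets the lookup loop below report `c.method` on failure. A reduced sketch of that deferred-construction pattern (all names here are illustrative):

```python
import dataclasses
import typing as T

@dataclasses.dataclass
class Candidate:
    # Record which constructor to call, and under what name/method, but
    # bind the heavy arguments (environment, kwargs) later, just before
    # instantiation -- mirroring DependencyCandidate above.
    factory: T.Callable[..., object]
    name: str
    method: str
    arguments: T.Optional[tuple] = None

    def __call__(self) -> object:
        if self.arguments is None:
            raise RuntimeError(f'{self.name}: arguments were never bound')
        env, kwargs = self.arguments
        return self.factory(self.name, env, kwargs)

def make_dep(name, env, kwargs):
    # stand-in for a dependency class constructor
    return (name, kwargs.get('native'))

c = Candidate(make_dep, 'zlib', 'pkg-config')
c.arguments = ({'fake': 'env'}, {'native': 'host'})  # late binding
assert c.method == 'pkg-config'   # readable even before construction
assert c() == ('zlib', 'host')
```

Unlike a `partial`, the dataclass fields give error paths something structured to format, instead of poking at `c.func` internals.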
- PackageTypes = T.Union[T.Type[ExternalDependency], DependencyFactory, WrappedFactoryFunc] + PackageTypes = T.Union[T.Type[ExternalDependency], DependencyFactory, DependencyCandidate, WrappedFactoryFunc] + # Workaround for older python + DependencyPackagesType = collections.UserDict[str, PackageTypes] +else: + DependencyPackagesType = collections.UserDict -class DependencyPackages(collections.UserDict): +class DependencyPackages(DependencyPackagesType): data: T.Dict[str, PackageTypes] defaults: T.Dict[str, str] = {} @@ -98,16 +102,17 @@ def find_external_dependency(name: str, env: 'Environment', kwargs: DependencyOb # display the dependency name with correct casing display_name = display_name_map.get(lname, lname) - for_machine = kwargs.get('native', MachineChoice.HOST) + for_machine = kwargs['native'] type_text = PerMachine('Build-time', 'Run-time')[for_machine] + ' dependency' # build a list of dependency methods to try if candidates is None: - candidates = _build_external_dependency_list(name, env, for_machine, kwargs) + candidates = _build_external_dependency_list(name, env, kwargs) pkg_exc: T.List[DependencyException] = [] pkgdep: T.List[ExternalDependency] = [] details = '' + tried_methods: T.List[str] = [] for c in candidates: # try this dependency method @@ -116,11 +121,15 @@ def find_external_dependency(name: str, env: 'Environment', kwargs: DependencyOb d._check_version() pkgdep.append(d) except DependencyException as e: - assert isinstance(c, functools.partial), 'for mypy' - bettermsg = f'Dependency lookup for {name} with method {c.func.log_tried()!r} failed: {e}' + bettermsg = f'Dependency lookup for {name} with method {c.method!r} failed: {e}' mlog.debug(bettermsg) e.args = (bettermsg,) pkg_exc.append(e) + except MesonException: + raise + except Exception as e: + bettermsg = f'Dependency lookup for {name} with method {c.method!r} failed: {e}' + raise MesonBugException(bettermsg) from e else: pkg_exc.append(None) details = d.log_details() @@ -131,7 
+140,7 @@ find_external_dependency(name: str, env: 'Environment', kwargs: DependencyOb # if the dependency was found if d.found(): info: mlog.TV_LoggableList = [] if d.version: info.append(mlog.normal_cyan(d.version)) @@ -143,16 +151,11 @@ find_external_dependency(name: str, env: 'Environment', kwargs: DependencyOb mlog.log(type_text, mlog.bold(display_name), details + 'found:', mlog.green('YES'), *info) return d + tried_methods.append(c.method) # otherwise, the dependency could not be found - tried_methods = [d.log_tried() for d in pkgdep if d.log_tried()] - if tried_methods: - tried = mlog.format_list(tried_methods) - else: - tried = '' - - mlog.log(type_text, mlog.bold(display_name), details + 'found:', mlog.red('NO'), - f'(tried {tried})' if tried else '') + tried = ' (tried {})'.format(mlog.format_list(tried_methods)) if tried_methods else '' + mlog.log(type_text, mlog.bold(display_name), details + 'found:', mlog.red('NO'), tried) if required: # if an exception occurred with the first detection method, re-raise it @@ -163,28 +166,27 @@ find_external_dependency(name: str, env: 'Environment', kwargs: DependencyOb # we have a list of failed ExternalDependency objects, so we can report # the methods we tried to find the dependency - raise DependencyException(f'Dependency "{name}" not found' + - (f', tried {tried}' if tried else '')) + raise DependencyException(f'Dependency "{name}" not found' + tried) return NotFoundDependency(name, env) -def _build_external_dependency_list(name: str, env: 'Environment', for_machine: MachineChoice, - kwargs: DependencyObjectKWs) -> T.List['DependencyGenerator']: +def _build_external_dependency_list(name: str, env: 'Environment', kwargs: DependencyObjectKWs + ) -> T.List['DependencyGenerator']: # Is there a specific dependency detector for this dependency? 
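The lookup that follows dispatches on the kind of entry stored in the `packages` registry: a bare dependency class is wrapped as a single candidate, while a factory callable is invoked to produce the candidate list itself. A simplified model of that dispatch (classes and names are illustrative, not Meson's real ones):

```python
# Sketch of the per-name dispatch in _build_external_dependency_list.
class ExternalDep:
    def __init__(self, name, env, kwargs):
        self.name = name

def zlib_factory(env, kwargs):
    # a factory returns a ready-made list of candidates
    return ['zlib-candidate']

packages = {'zlib': zlib_factory, 'foo': ExternalDep}

def build_list(name, env, kwargs):
    entry = packages[name.lower()]    # lookups are case-insensitive
    if isinstance(entry, type) and issubclass(entry, ExternalDep):
        # bare class: wrap it as the single deferred candidate
        return [lambda: entry(name, env, kwargs)]
    # otherwise the registry stored a factory producing candidates
    return entry(env, kwargs)

assert build_list('zlib', None, {}) == ['zlib-candidate']
assert build_list('Foo', None, {})[0]().name == 'Foo'
```

Note how the original (non-lowercased) `name` is still passed through to the constructor, matching the diff's use of `display_name` for correctly-cased log output.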
     lname = name.lower()
     if lname in packages:
-        # Create the list of dependency object constructors using a factory
-        # class method, if one exists, otherwise the list just consists of the
-        # constructor
-        if isinstance(packages[lname], type):
-            entry1 = T.cast('T.Type[ExternalDependency]', packages[lname])  # mypy doesn't understand isinstance(..., type)
-            if issubclass(entry1, ExternalDependency):
-                func: T.Callable[[], 'ExternalDependency'] = functools.partial(entry1, env, kwargs)
-                dep = [func]
+        entry = packages[lname]
+        if isinstance(entry, type):
+            if issubclass(entry, ExternalDependency):
+                dep = [DependencyCandidate.from_dependency(name, entry, (env, kwargs))]
+            else:
+                raise MesonBugException(f'Got an invalid type in the dependency list: {entry!r}')
+        elif isinstance(entry, DependencyCandidate):
+            entry.arguments = (env, kwargs)
+            dep = [entry]
         else:
-            entry2 = T.cast('T.Union[DependencyFactory, WrappedFactoryFunc]', packages[lname])
-            dep = entry2(env, for_machine, kwargs)
+            dep = entry(env, kwargs)
         return dep
 
     candidates: T.List['DependencyGenerator'] = []
 
@@ -200,24 +202,24 @@ def _build_external_dependency_list(name: str, env: 'Environment', for_machine:
     # Exclusive to when it is explicitly requested
     if DependencyMethods.DUB in methods:
         from .dub import DubDependency
-        candidates.append(functools.partial(DubDependency, name, env, kwargs))
+        candidates.append(DependencyCandidate.from_dependency(name, DubDependency, (env, kwargs)))
 
     # Preferred first candidate for auto.
     if DependencyMethods.PKGCONFIG in methods:
         from .pkgconfig import PkgConfigDependency
-        candidates.append(functools.partial(PkgConfigDependency, name, env, kwargs))
+        candidates.append(DependencyCandidate.from_dependency(name, PkgConfigDependency, (env, kwargs)))
 
     # On OSX only, try framework dependency detector.
     if DependencyMethods.EXTRAFRAMEWORK in methods:
-        if env.machines[for_machine].is_darwin():
+        if env.machines[kwargs['native']].is_darwin():
             from .framework import ExtraFrameworkDependency
-            candidates.append(functools.partial(ExtraFrameworkDependency, name, env, kwargs))
+            candidates.append(DependencyCandidate.from_dependency(name, ExtraFrameworkDependency, (env, kwargs)))
 
     # Only use CMake:
     # - if it's explicitly requested
     # - as a last resort, since it might not work 100% (see #6113)
     if DependencyMethods.CMAKE in methods:
         from .cmake import CMakeDependency
-        candidates.append(functools.partial(CMakeDependency, name, env, kwargs))
+        candidates.append(DependencyCandidate.from_dependency(name, CMakeDependency, (env, kwargs)))
 
     return candidates
diff --git a/mesonbuild/dependencies/dev.py b/mesonbuild/dependencies/dev.py
index 4d219a589..4b043a07e 100644
--- a/mesonbuild/dependencies/dev.py
+++ b/mesonbuild/dependencies/dev.py
@@ -17,7 +17,7 @@ from mesonbuild.interpreterbase.decorators import FeatureDeprecated
 from .. import mesonlib, mlog
 from ..tooldetect import get_llvm_tool_names
 from ..mesonlib import version_compare, version_compare_many, search_version
-from .base import DependencyException, DependencyMethods, detect_compiler, strip_system_includedirs, strip_system_libdirs, SystemDependency, ExternalDependency, DependencyTypeName
+from .base import DependencyException, DependencyMethods, detect_compiler, strip_system_includedirs, strip_system_libdirs, SystemDependency, ExternalDependency, DependencyCandidate
 from .cmake import CMakeDependency
 from .configtool import ConfigToolDependency
 from .detect import packages
@@ -48,12 +48,13 @@ def get_shared_library_suffix(environment: 'Environment', for_machine: MachineCh
 
 class GTestDependencySystem(SystemDependency):
     def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs) -> None:
-        super().__init__(name, environment, kwargs, language='cpp')
+        kwargs['language'] = 'cpp'
+        super().__init__(name, environment, kwargs)
         self.main = kwargs.get('main', False)
         sysroot = environment.properties[self.for_machine].get_sys_root() or ''
         self.src_dirs = [sysroot + '/usr/src/gtest/src', sysroot + '/usr/src/googletest/googletest/src']
-        if not self._add_sub_dependency(threads_factory(environment, self.for_machine, {})):
+        if not self._add_sub_dependency(threads_factory(environment, {'native': self.for_machine})):
             self.is_found = False
             return
         self.detect()
@@ -113,22 +114,24 @@ class GTestDependencyPC(PkgConfigDependency):
 
 class GMockDependencySystem(SystemDependency):
     def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs) -> None:
-        super().__init__(name, environment, kwargs, language='cpp')
+        kwargs['language'] = 'cpp'
+        super().__init__(name, environment, kwargs)
         self.main = kwargs.get('main', False)
 
-        if not self._add_sub_dependency(threads_factory(environment, self.for_machine, {})):
+        if not self._add_sub_dependency(threads_factory(environment, {'native': self.for_machine})):
             self.is_found = False
             return
 
         # If we are getting main() from GMock, we definitely
         # want to avoid linking in main() from GTest
         gtest_kwargs = kwargs.copy()
+        gtest_kwargs['native'] = self.for_machine
         if self.main:
             gtest_kwargs['main'] = False
 
         # GMock without GTest is pretty much useless
         # this also mimics the structure given in WrapDB,
         # where GMock always pulls in GTest
-        found = self._add_sub_dependency(gtest_factory(environment, self.for_machine, gtest_kwargs))
+        found = self._add_sub_dependency(gtest_factory(environment, gtest_kwargs))
        if not found:
            self.is_found = False
            return
@@ -188,20 +191,21 @@ class LLVMDependencyConfigTool(ConfigToolDependency):
     __cpp_blacklist = {'-DNDEBUG'}
 
     def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs):
+        kwargs['language'] = 'cpp'
         self.tools = get_llvm_tool_names('llvm-config')
 
         # Fedora starting with Fedora 30 adds a suffix of the number
         # of bits in the isa that llvm targets, for example, on x86_64
         # and aarch64 the name will be llvm-config-64, on x86 and arm
         # it will be llvm-config-32.
-        if environment.machines[kwargs.get('native', mesonlib.MachineChoice.HOST)].is_64_bit:
+        if environment.machines[kwargs['native']].is_64_bit:
             self.tools.append('llvm-config-64')
         else:
             self.tools.append('llvm-config-32')
 
         # It's necessary for LLVM <= 3.8 to use the C++ linker. For 3.9 and 4.0
         # the C linker works fine if only using the C API.
-        super().__init__(name, environment, kwargs, language='cpp')
+        super().__init__(name, environment, kwargs)
         self.provided_modules: T.List[str] = []
         self.required_modules: mesonlib.OrderedSet[str] = mesonlib.OrderedSet()
         self.module_details: T.List[str] = []
@@ -224,7 +228,7 @@ class LLVMDependencyConfigTool(ConfigToolDependency):
         self._set_old_link_args()
         self.link_args = strip_system_libdirs(environment, self.for_machine, self.link_args)
         self.link_args = self.__fix_bogus_link_args(self.link_args)
-        if not self._add_sub_dependency(threads_factory(environment, self.for_machine, {})):
+        if not self._add_sub_dependency(threads_factory(environment, {'native': self.for_machine})):
             self.is_found = False
             return
 
@@ -383,18 +387,18 @@ class LLVMDependencyConfigTool(ConfigToolDependency):
 
 class LLVMDependencyCMake(CMakeDependency):
     def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs) -> None:
+        kwargs['language'] = 'cpp'
         self.llvm_modules = kwargs.get('modules', [])
         self.llvm_opt_modules = kwargs.get('optional_modules', [])
 
-        for_machine = kwargs.get('native', mesonlib.MachineChoice.HOST)
+        for_machine = kwargs['native']
         compilers = env.coredata.compilers[for_machine]
         if not compilers or not {'c', 'cpp'}.issubset(compilers):
             # Initialize basic variables
-            ExternalDependency.__init__(self, DependencyTypeName('cmake'), env, kwargs)
+            ExternalDependency.__init__(self, name, env, kwargs)
 
             # Initialize CMake specific variables
             self.found_modules: T.List[str] = []
-            self.name = name
 
             langs: T.List[str] = []
             if not compilers:
@@ -422,7 +426,7 @@ class LLVMDependencyCMake(CMakeDependency):
             )
             return
 
-        super().__init__(name, env, kwargs, language='cpp', force_use_global_compilers=True)
+        super().__init__(name, env, kwargs, force_use_global_compilers=True)
 
         if not self.cmakebin.found():
             return
@@ -444,7 +448,7 @@ class LLVMDependencyCMake(CMakeDependency):
         temp = ['-I' + x for x in inc_dirs] + defs
         self.compile_args += [x for x in temp if x not in self.compile_args]
         self.compile_args = strip_system_includedirs(env, self.for_machine, self.compile_args)
-        if not self._add_sub_dependency(threads_factory(env, self.for_machine, {})):
+        if not self._add_sub_dependency(threads_factory(env, {'native': self.for_machine})):
             self.is_found = False
             return
 
@@ -506,8 +510,8 @@ class ValgrindDependency(PkgConfigDependency):
     Consumers of Valgrind usually only need the compile args and do not want
     to link to its (static) libraries.
     '''
-    def __init__(self, env: 'Environment', kwargs: DependencyObjectKWs):
-        super().__init__('valgrind', env, kwargs)
+    def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs):
+        super().__init__(name, env, kwargs)
 
     def get_link_args(self, language: T.Optional[str] = None, raw: bool = False) -> T.List[str]:
         return []
@@ -554,8 +558,8 @@ class ZlibSystemDependency(SystemDependency):
 
 
 class JNISystemDependency(SystemDependency):
-    def __init__(self, environment: 'Environment', kwargs: DependencyObjectKWs):
-        super().__init__('jni', environment, kwargs)
+    def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs):
+        super().__init__(name, environment, kwargs)
 
         self.feature_since = ('0.62.0', '')
 
@@ -682,8 +686,8 @@ packages['jni'] = JNISystemDependency
 
 
 class JDKSystemDependency(JNISystemDependency):
-    def __init__(self, environment: 'Environment', kwargs: DependencyObjectKWs):
-        super().__init__(environment, kwargs)
+    def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs):
+        super().__init__('jni', environment, kwargs)
 
         self.feature_since = ('0.59.0', '')
         self.featurechecks.append(FeatureDeprecated(
@@ -748,8 +752,8 @@ class DiaSDKSystemDependency(SystemDependency):
         defval, _ = compiler.get_define(dname, '', [], [])
         return defval is not None
 
-    def __init__(self, environment: 'Environment', kwargs: DependencyObjectKWs) -> None:
-        super().__init__('diasdk', environment, kwargs)
+    def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs) -> None:
+        super().__init__(name, environment, kwargs)
         self.is_found = False
 
         compilers = environment.coredata.compilers.host
@@ -798,27 +802,27 @@ packages['diasdk'] = DiaSDKSystemDependency
 
 packages['llvm'] = llvm_factory = DependencyFactory(
     'LLVM',
     [DependencyMethods.CMAKE, DependencyMethods.CONFIG_TOOL],
-    cmake_class=LLVMDependencyCMake,
-    configtool_class=LLVMDependencyConfigTool,
+    cmake=LLVMDependencyCMake,
+    configtool=LLVMDependencyConfigTool,
 )
 
 packages['gtest'] = gtest_factory = DependencyFactory(
     'gtest',
     [DependencyMethods.PKGCONFIG, DependencyMethods.SYSTEM],
-    pkgconfig_class=GTestDependencyPC,
-    system_class=GTestDependencySystem,
+    pkgconfig=GTestDependencyPC,
+    system=GTestDependencySystem,
 )
 
 packages['gmock'] = gmock_factory = DependencyFactory(
     'gmock',
     [DependencyMethods.PKGCONFIG, DependencyMethods.SYSTEM],
-    pkgconfig_class=GMockDependencyPC,
-    system_class=GMockDependencySystem,
+    pkgconfig=GMockDependencyPC,
+    system=GMockDependencySystem,
 )
 
 packages['zlib'] = zlib_factory = DependencyFactory(
     'zlib',
     [DependencyMethods.PKGCONFIG, DependencyMethods.CMAKE, DependencyMethods.SYSTEM],
-    cmake_name='ZLIB',
-    system_class=ZlibSystemDependency,
+    cmake=DependencyCandidate.from_dependency('ZLIB', CMakeDependency),
+    system=ZlibSystemDependency,
 )
diff --git a/mesonbuild/dependencies/dub.py b/mesonbuild/dependencies/dub.py
index 2166a951e..7b4daf37c 100644
--- a/mesonbuild/dependencies/dub.py
+++ b/mesonbuild/dependencies/dub.py
@@ -68,6 +68,8 @@ class DubDependency(ExternalDependency):
     class_dubbin_searched = False
     class_cache_dir = ''
 
+    type_name = DependencyTypeName('dub')
+
     # Map Meson Compiler ID's to Dub Compiler ID's
     _ID_MAP: T.Mapping[str, str] = {
         'dmd': 'dmd',
@@ -76,8 +78,8 @@ class DubDependency(ExternalDependency):
     }
 
     def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs):
-        super().__init__(DependencyTypeName('dub'), environment, kwargs, language='d')
-        self.name = name
+        kwargs['language'] = 'd'
+        super().__init__(name, environment, kwargs)
         from ..compilers.d import DCompiler, d_feature_args
 
         _temp_comp = super().get_compiler()
@@ -304,7 +306,7 @@ class DubDependency(ExternalDependency):
         for lib in bs['libs']:
             if os.name != 'nt':
                 # trying to add system libraries by pkg-config
-                pkgdep = PkgConfigDependency(lib, environment, {'required': True, 'silent': True})
+                pkgdep = PkgConfigDependency(lib, environment, {'required': True, 'silent': True, 'native': self.for_machine})
 
                 if pkgdep.is_found:
                     for arg in pkgdep.get_compile_args():
                         self.compile_args.append(arg)
diff --git a/mesonbuild/dependencies/factory.py b/mesonbuild/dependencies/factory.py
index 0c4ca8174..02652e5e7 100644
--- a/mesonbuild/dependencies/factory.py
+++ b/mesonbuild/dependencies/factory.py
@@ -7,7 +7,8 @@ from __future__ import annotations
 
 import functools
 import typing as T
 
-from .base import DependencyException, DependencyMethods
+from ..mesonlib import MachineChoice
+from .base import DependencyCandidate, DependencyException, DependencyMethods
 from .base import process_method_kw
 from .base import BuiltinDependency, SystemDependency
 from .cmake import CMakeDependency
@@ -15,16 +16,17 @@ from .framework import ExtraFrameworkDependency
 from .pkgconfig import PkgConfigDependency
 
 if T.TYPE_CHECKING:
-    from .base import DependencyObjectKWs, ExternalDependency
+    from typing_extensions import TypeAlias
+
+    from .base import DependencyObjectKWs, ExternalDependency, DepType
     from .configtool import ConfigToolDependency
     from ..environment import Environment
-    from ..mesonlib import MachineChoice
 
-    DependencyGenerator = T.Callable[[], ExternalDependency]
+    # TODO: remove this?
+    DependencyGenerator: TypeAlias = DependencyCandidate[ExternalDependency]
 
     FactoryFunc = T.Callable[
         [
             'Environment',
-            MachineChoice,
             DependencyObjectKWs,
             T.List[DependencyMethods]
         ],
@@ -34,16 +36,11 @@ if T.TYPE_CHECKING:
     WrappedFactoryFunc = T.Callable[
         [
             'Environment',
-            MachineChoice,
             DependencyObjectKWs,
         ],
         T.List[DependencyGenerator]
     ]
 
-    # This should be str, Environment, T.Dict[str, T.Any], T.Optional[str]
-    # But if you try that, you get error: Cannot infer type of lambda
-    CmakeDependencyFunc = T.Callable[..., CMakeDependency]
-
 
 class DependencyFactory:
 
     """Factory to get dependencies from multiple sources.
@@ -52,53 +49,61 @@ class DependencyFactory:
     for various kinds of dependencies. When the initialized object is called
     it returns a list of callables return Dependency objects to try in order.
 
-    :name: The name of the dependency. This will be passed as the name
+    :param name: The name of the dependency. This will be passed as the name
         parameter of the each dependency unless it is overridden on a per
         type basis.
-    :methods: An ordered list of DependencyMethods. This is the order
+    :param methods: An ordered list of DependencyMethods. This is the order
         dependencies will be returned in unless they are removed by the
         _process_method function
-    :*_name: This will overwrite the name passed to the corresponding class.
-        For example, if the name is 'zlib', but cmake calls the dependency
-        'Z', then using `cmake_name='Z'` will pass the name as 'Z' to cmake.
-    :*_class: A *type* or callable that creates a class, and has the
-        signature of an ExternalDependency
-    :system_class: If you pass DependencyMethods.SYSTEM in methods, you must
-        set this argument.
+    :param extra_kwargs: Additional keyword arguments to add when creating the
+        DependencyCandidate
+    :param pkgconfig: A custom PackageConfig lookup to use
+    :param cmake: A custom CMake lookup to use
+    :param framework: A custom AppleFramework lookup to use
+    :param configtool: A custom ConfigTool lookup to use. If
+        DependencyMethods.CONFIG_TOOL is in the `:param:methods` argument,
+        this must be set.
+    :param builtin: A custom Builtin lookup to use. If
+        DependencyMethods.BUILTIN is in the `:param:methods` argument,
+        this must be set.
+    :param system: A custom System lookup to use. If
+        DependencyMethods.SYSTEM is in the `:param:methods` argument,
+        this must be set.
    """
 
    def __init__(self, name: str, methods: T.List[DependencyMethods], *,
                 extra_kwargs: T.Optional[DependencyObjectKWs] = None,
-                 pkgconfig_name: T.Optional[str] = None,
-                 pkgconfig_class: 'T.Type[PkgConfigDependency]' = PkgConfigDependency,
-                 cmake_name: T.Optional[str] = None,
-                 cmake_class: 'T.Union[T.Type[CMakeDependency], CmakeDependencyFunc]' = CMakeDependency,
-                 configtool_class: 'T.Optional[T.Type[ConfigToolDependency]]' = None,
-                 framework_name: T.Optional[str] = None,
-                 framework_class: 'T.Type[ExtraFrameworkDependency]' = ExtraFrameworkDependency,
-                 builtin_class: 'T.Type[BuiltinDependency]' = BuiltinDependency,
-                 system_class: 'T.Type[SystemDependency]' = SystemDependency):
-
-        if DependencyMethods.CONFIG_TOOL in methods and not configtool_class:
-            raise DependencyException('A configtool must have a custom class')
-
-        self.extra_kwargs = extra_kwargs or {}
+                 pkgconfig: T.Union[DependencyCandidate[PkgConfigDependency], T.Type[PkgConfigDependency], None] = PkgConfigDependency,
+                 cmake: T.Union[DependencyCandidate[CMakeDependency], T.Type[CMakeDependency], None] = CMakeDependency,
+                 framework: T.Union[DependencyCandidate[ExtraFrameworkDependency], T.Type[ExtraFrameworkDependency], None] = ExtraFrameworkDependency,
+                 configtool: T.Union[DependencyCandidate[ConfigToolDependency], T.Type[ConfigToolDependency], None] = None,
+                 builtin: T.Union[DependencyCandidate[BuiltinDependency], T.Type[BuiltinDependency], None] = None,
+                 system: T.Union[DependencyCandidate[SystemDependency], T.Type[SystemDependency], None] = None):
+
+        if DependencyMethods.CONFIG_TOOL in methods and not configtool:
+            raise DependencyException('A configtool dependency must have a custom class')
+        if DependencyMethods.BUILTIN in methods and not builtin:
+            raise DependencyException('A builtin dependency must have a custom class')
+        if DependencyMethods.SYSTEM in methods and not system:
+            raise DependencyException('A system dependency must have a custom class')
+
+        def make(arg: T.Union[DependencyCandidate[DepType], T.Type[DepType], None]) -> T.Optional[DependencyCandidate[DepType]]:
+            if arg is None or isinstance(arg, DependencyCandidate):
+                return arg
+            return DependencyCandidate.from_dependency(name, arg)
+
+        self.extra_kwargs = extra_kwargs
         self.methods = methods
-        self.classes: T.Dict[
-            DependencyMethods,
-            T.Callable[['Environment', DependencyObjectKWs], ExternalDependency]
-        ] = {
+        self.classes: T.Mapping[DependencyMethods, T.Optional[DependencyCandidate[ExternalDependency]]] = {
             # Just attach the correct name right now, either the generic name
             # or the method specific name.
-            DependencyMethods.EXTRAFRAMEWORK: functools.partial(framework_class, framework_name or name),
-            DependencyMethods.PKGCONFIG: functools.partial(pkgconfig_class, pkgconfig_name or name),
-            DependencyMethods.CMAKE: functools.partial(cmake_class, cmake_name or name),
-            DependencyMethods.SYSTEM: functools.partial(system_class, name),
-            DependencyMethods.BUILTIN: functools.partial(builtin_class, name),
-            DependencyMethods.CONFIG_TOOL: None,
+            DependencyMethods.EXTRAFRAMEWORK: make(framework),
+            DependencyMethods.PKGCONFIG: make(pkgconfig),
+            DependencyMethods.CMAKE: make(cmake),
+            DependencyMethods.SYSTEM: make(system),
+            DependencyMethods.BUILTIN: make(builtin),
+            DependencyMethods.CONFIG_TOOL: make(configtool),
        }
-        if configtool_class is not None:
-            self.classes[DependencyMethods.CONFIG_TOOL] = functools.partial(configtool_class, name)
 
     @staticmethod
     def _process_method(method: DependencyMethods, env: 'Environment', for_machine: MachineChoice) -> bool:
@@ -115,15 +120,24 @@ class DependencyFactory:
             return False
         return True
 
-    def __call__(self, env: 'Environment', for_machine: MachineChoice,
-                 kwargs: DependencyObjectKWs) -> T.List['DependencyGenerator']:
+    def __call__(self, env: 'Environment', kwargs: DependencyObjectKWs) -> T.List['DependencyGenerator']:
         """Return a list of Dependencies with the arguments already attached."""
         methods = process_method_kw(self.methods, kwargs)
-        nwargs = self.extra_kwargs.copy()
-        nwargs.update(kwargs)
-
-        return [functools.partial(self.classes[m], env, nwargs) for m in methods
-                if self._process_method(m, env, for_machine)]
+        if self.extra_kwargs:
+            nwargs = self.extra_kwargs.copy()
+            nwargs.update(kwargs)
+        else:
+            nwargs = kwargs.copy()
+
+        ret: T.List[DependencyGenerator] = []
+        for m in methods:
+            if self._process_method(m, env, kwargs['native']):
+                c = self.classes[m]
+                if c is None:
+                    continue
+                c.arguments = (env, nwargs)
+                ret.append(c)
+        return ret
 
 
 def factory_methods(methods: T.Set[DependencyMethods]) -> T.Callable[['FactoryFunc'], 'WrappedFactoryFunc']:
@@ -138,8 +152,8 @@ def factory_methods(methods: T.Set[DependencyMethods]) -> T.Callable[['FactoryFu
 
     def inner(func: 'FactoryFunc') -> 'WrappedFactoryFunc':
         @functools.wraps(func)
-        def wrapped(env: 'Environment', for_machine: MachineChoice, kwargs: DependencyObjectKWs) -> T.List['DependencyGenerator']:
-            return func(env, for_machine, kwargs, process_method_kw(methods, kwargs))
+        def wrapped(env: 'Environment', kwargs: DependencyObjectKWs) -> T.List['DependencyGenerator']:
+            return func(env, kwargs, process_method_kw(methods, kwargs))
         return wrapped
diff --git a/mesonbuild/dependencies/framework.py b/mesonbuild/dependencies/framework.py
index a23b4a66b..e3ebd3b4a 100644
--- a/mesonbuild/dependencies/framework.py
+++ b/mesonbuild/dependencies/framework.py
@@ -4,7 +4,7 @@ from __future__ import annotations
 
 from .base import DependencyTypeName, ExternalDependency, DependencyException
-from ..mesonlib import MesonException, Version, stringlistify
+from ..mesonlib import MesonException, Version
 from .. import mlog
 from pathlib import Path
 import typing as T
@@ -16,10 +16,11 @@ if T.TYPE_CHECKING:
 class ExtraFrameworkDependency(ExternalDependency):
     system_framework_paths: T.Optional[T.List[str]] = None
 
-    def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs, language: T.Optional[str] = None) -> None:
-        paths = stringlistify(kwargs.get('paths', []))
-        super().__init__(DependencyTypeName('extraframeworks'), env, kwargs, language=language)
-        self.name = name
+    type_name = DependencyTypeName('extraframeworks')
+
+    def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs) -> None:
+        paths = kwargs.get('paths', [])
+        super().__init__(name, env, kwargs)
         # Full path to framework directory
         self.framework_path: T.Optional[str] = None
         if not self.clib_compiler:
@@ -111,7 +112,3 @@ class ExtraFrameworkDependency(ExternalDependency):
 
     def log_info(self) -> str:
         return self.framework_path or ''
-
-    @staticmethod
-    def log_tried() -> str:
-        return 'framework'
diff --git a/mesonbuild/dependencies/hdf5.py b/mesonbuild/dependencies/hdf5.py
index 5894cfb46..2c020669a 100644
--- a/mesonbuild/dependencies/hdf5.py
+++ b/mesonbuild/dependencies/hdf5.py
@@ -4,13 +4,12 @@
 
 # This file contains the detection logic for miscellaneous external dependencies.
 from __future__ import annotations
-import functools
 import os
 import re
 
 from pathlib import Path
 
-from ..mesonlib import OrderedSet, join_args, MachineChoice
-from .base import DependencyException, DependencyMethods
+from ..mesonlib import OrderedSet, join_args
+from .base import DependencyCandidate, DependencyException, DependencyMethods
 from .configtool import ConfigToolDependency
 from .detect import packages
 from .pkgconfig import PkgConfigDependency, PkgConfigInterface
@@ -27,12 +26,12 @@ class HDF5PkgConfigDependency(PkgConfigDependency):
 
     """Handle brokenness in the HDF5 pkg-config files."""
 
-    def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs, language: T.Optional[str] = None) -> None:
-        language = language or 'c'
+    def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs) -> None:
+        language = kwargs.get('language') or 'c'
         if language not in {'c', 'cpp', 'fortran'}:
             raise DependencyException(f'Language {language} is not supported with HDF5.')
 
-        super().__init__(name, environment, kwargs, language)
+        super().__init__(name, environment, kwargs)
         if not self.is_found:
             return
 
@@ -78,8 +77,8 @@ class HDF5ConfigToolDependency(ConfigToolDependency):
 
     version_arg = '-showconfig'
 
-    def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs, language: T.Optional[str] = None) -> None:
-        language = language or 'c'
+    def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs) -> None:
+        language = kwargs.get('language') or 'c'
         if language not in {'c', 'cpp', 'fortran'}:
             raise DependencyException(f'Language {language} is not supported with HDF5.')
 
@@ -98,20 +97,19 @@ class HDF5ConfigToolDependency(ConfigToolDependency):
         else:
             raise DependencyException('How did you get here?')
 
-        # We need this before we call super()
-        for_machine = kwargs.get('native', MachineChoice.HOST)
-
         nkwargs = kwargs.copy()
         nkwargs['tools'] = tools
 
         # Override the compiler that the config tools are going to use by
         # setting the environment variables that they use for the compiler and
         # linkers.
+
+        for_machine = kwargs['native']
         compiler = environment.coredata.compilers[for_machine][language]
         try:
             os.environ[f'HDF5_{cenv}'] = join_args(compiler.get_exelist())
             os.environ[f'HDF5_{lenv}LINKER'] = join_args(compiler.get_linker_exelist())
-            super().__init__(name, environment, nkwargs, language)
+            super().__init__(name, environment, nkwargs)
         finally:
             del os.environ[f'HDF5_{cenv}']
             del os.environ[f'HDF5_{lenv}LINKER']
@@ -143,10 +141,10 @@ class HDF5ConfigToolDependency(ConfigToolDependency):
 
 
 @factory_methods({DependencyMethods.PKGCONFIG, DependencyMethods.CONFIG_TOOL})
-def hdf5_factory(env: 'Environment', for_machine: 'MachineChoice',
-                 kwargs: DependencyObjectKWs, methods: T.List[DependencyMethods]) -> T.List['DependencyGenerator']:
-    language = kwargs.get('language')
+def hdf5_factory(env: 'Environment', kwargs: DependencyObjectKWs,
+                 methods: T.List[DependencyMethods]) -> T.List['DependencyGenerator']:
     candidates: T.List['DependencyGenerator'] = []
+    for_machine = kwargs['native']
 
     if DependencyMethods.PKGCONFIG in methods:
         # Use an ordered set so that these remain the first tried pkg-config files
@@ -162,10 +160,12 @@ def hdf5_factory(env: 'Environment', for_machine: 'MachineChoice',
             # use just the standard files if pkg-config --list-all fails
             pass
         for mod in pkgconfig_files:
-            candidates.append(functools.partial(HDF5PkgConfigDependency, mod, env, kwargs, language))
+            candidates.append(DependencyCandidate.from_dependency(
+                mod, HDF5PkgConfigDependency, (env, kwargs)))
 
     if DependencyMethods.CONFIG_TOOL in methods:
-        candidates.append(functools.partial(HDF5ConfigToolDependency, 'hdf5', env, kwargs, language))
+        candidates.append(DependencyCandidate.from_dependency(
+            'hdf5', HDF5ConfigToolDependency, (env, kwargs)))
 
     return candidates
diff --git a/mesonbuild/dependencies/misc.py b/mesonbuild/dependencies/misc.py
index b1b8b8e13..1f9f06e97 100644
--- a/mesonbuild/dependencies/misc.py
+++ b/mesonbuild/dependencies/misc.py
@@ -4,15 +4,14 @@
 # This file contains the detection logic for miscellaneous external dependencies.
 from __future__ import annotations
 
-import functools
 import re
 import typing as T
 
 from .. import mesonlib
 from .. import mlog
 
-from .base import DependencyException, DependencyMethods
+from .base import DependencyCandidate, DependencyException, DependencyMethods
 from .base import BuiltinDependency, SystemDependency
-from .cmake import CMakeDependency, CMakeDependencyFactory
+from .cmake import CMakeDependency
 from .configtool import ConfigToolDependency
 from .detect import packages
 from .factory import DependencyFactory, factory_methods
@@ -27,7 +26,6 @@ if T.TYPE_CHECKING:
 
 @factory_methods({DependencyMethods.PKGCONFIG, DependencyMethods.CMAKE})
 def netcdf_factory(env: 'Environment',
-                   for_machine: 'mesonlib.MachineChoice',
                    kwargs: DependencyObjectKWs,
                    methods: T.List[DependencyMethods]) -> T.List['DependencyGenerator']:
     language = kwargs.get('language')
@@ -44,10 +42,12 @@ def netcdf_factory(env: 'Environment',
         else:
             pkg = 'netcdf'
 
-        candidates.append(functools.partial(PkgConfigDependency, pkg, env, kwargs, language=language))
+        candidates.append(DependencyCandidate.from_dependency(
+            pkg, PkgConfigDependency, (env, kwargs)))
 
     if DependencyMethods.CMAKE in methods:
-        candidates.append(functools.partial(CMakeDependency, 'NetCDF', env, kwargs, language=language))
+        candidates.append(DependencyCandidate.from_dependency(
+            'NetCDF', CMakeDependency, (env, kwargs)))
 
     return candidates
 
@@ -113,9 +113,8 @@ class OpenMPDependency(SystemDependency):
         '199810': '1.0',
     }
 
-    def __init__(self, environment: 'Environment', kwargs: DependencyObjectKWs) -> None:
-        language = kwargs.get('language')
-        super().__init__('openmp', environment, kwargs, language=language)
+    def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs) -> None:
+        super().__init__(name, environment, kwargs)
         self.is_found = False
         if self.clib_compiler.get_id() == 'nagfor':
             # No macro defined for OpenMP, but OpenMP 3.1 is supported.
@@ -181,8 +180,8 @@ class ThreadDependency(SystemDependency):
 
 
 class BlocksDependency(SystemDependency):
-    def __init__(self, environment: 'Environment', kwargs: DependencyObjectKWs) -> None:
-        super().__init__('blocks', environment, kwargs)
+    def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs) -> None:
+        super().__init__(name, environment, kwargs)
         self.name = 'blocks'
         self.is_found = False
 
@@ -304,8 +303,8 @@ class GpgmeDependencyConfigTool(ConfigToolDependency):
 
 
 class ShadercDependency(SystemDependency):
-    def __init__(self, environment: 'Environment', kwargs: DependencyObjectKWs):
-        super().__init__('shaderc', environment, kwargs)
+    def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs):
+        super().__init__(name, environment, kwargs)
 
         static_lib = 'shaderc_combined'
         shared_lib = 'shaderc_shared'
@@ -336,7 +335,7 @@ class CursesConfigToolDependency(ConfigToolDependency):
     # ncurses5.4-config is for macOS Catalina
     tools = ['ncursesw6-config', 'ncursesw5-config', 'ncurses6-config', 'ncurses5-config', 'ncurses5.4-config']
 
-    def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs, language: T.Optional[str] = None):
+    def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs):
         exclude_paths = None
         # macOS mistakenly ships /usr/bin/ncurses5.4-config and a man page for
         # it, but none of the headers or libraries. Ignore /usr/bin because it
@@ -344,7 +343,7 @@ class CursesConfigToolDependency(ConfigToolDependency):
         # Homebrew is /usr/local or /opt/homebrew.
         if env.machines.build and env.machines.build.system == 'darwin':
             exclude_paths = ['/usr/bin']
-        super().__init__(name, env, kwargs, language, exclude_paths=exclude_paths)
+        super().__init__(name, env, kwargs, exclude_paths=exclude_paths)
         if not self.is_found:
             return
         self.compile_args = self.get_config_value(['--cflags'], 'compile_args')
@@ -450,7 +449,7 @@ class IntlSystemDependency(SystemDependency):
             self.is_found = True
 
             if self.static:
-                if not self._add_sub_dependency(iconv_factory(env, self.for_machine, {'static': True})):
+                if not self._add_sub_dependency(iconv_factory(env, {'static': True, 'native': self.for_machine})):
                     self.is_found = False
@@ -461,6 +460,7 @@ class OpensslSystemDependency(SystemDependency):
         dependency_kwargs: DependencyObjectKWs = {
             'method': DependencyMethods.SYSTEM,
             'static': self.static,
+            'native': kwargs.get('native'),
         }
         if not self.clib_compiler.has_header('openssl/ssl.h', '')[0]:
             return
@@ -478,8 +478,8 @@ class OpensslSystemDependency(SystemDependency):
             self.version = '.'.join(str(i) for i in version_ints[:3]) + chr(ord('a') + version_ints[3] - 1)
 
         if name == 'openssl':
-            if self._add_sub_dependency(libssl_factory(env, self.for_machine, dependency_kwargs)) and \
-                    self._add_sub_dependency(libcrypto_factory(env, self.for_machine, dependency_kwargs)):
+            if self._add_sub_dependency(libssl_factory(env, dependency_kwargs)) and \
+                    self._add_sub_dependency(libcrypto_factory(env, dependency_kwargs)):
                 self.is_found = True
             return
         else:
@@ -491,11 +491,11 @@ class OpensslSystemDependency(SystemDependency):
                 self.is_found = True
             else:
                 if name == 'libssl':
-                    if self._add_sub_dependency(libcrypto_factory(env, self.for_machine, dependency_kwargs)):
+                    if self._add_sub_dependency(libcrypto_factory(env, dependency_kwargs)):
                         self.is_found = True
                 elif name == 'libcrypto':
                     use_threads = self.clib_compiler.has_header_symbol('openssl/opensslconf.h', 'OPENSSL_THREADS', '', dependencies=[self])[0]
-                    if not use_threads or self._add_sub_dependency(threads_factory(env, self.for_machine, {})):
+                    if not use_threads or self._add_sub_dependency(threads_factory(env, {'native': self.for_machine})):
                         self.is_found = True
 
                     # only relevant on platforms where it is distributed with the libc, in which case it always succeeds
                     sublib = self.clib_compiler.find_library('dl', [], self.libtype)
@@ -508,8 +508,8 @@ class ObjFWDependency(ConfigToolDependency):
     tools = ['objfw-config']
     tool_name = 'objfw-config'
 
-    def __init__(self, environment: 'Environment', kwargs: DependencyObjectKWs):
-        super().__init__('objfw', environment, kwargs)
+    def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs):
+        super().__init__(name, environment, kwargs)
         self.feature_since = ('1.5.0', '')
         if not self.is_found:
             return
@@ -529,25 +529,28 @@ class ObjFWDependency(ConfigToolDependency):
 
 @factory_methods({DependencyMethods.PKGCONFIG, DependencyMethods.CONFIG_TOOL, DependencyMethods.SYSTEM})
 def curses_factory(env: 'Environment',
-                   for_machine: 'mesonlib.MachineChoice',
                    kwargs: DependencyObjectKWs,
                    methods: T.List[DependencyMethods]) -> T.List['DependencyGenerator']:
     candidates: T.List['DependencyGenerator'] = []
+    for_machine = kwargs['native']
 
     if DependencyMethods.PKGCONFIG in methods:
         pkgconfig_files = ['pdcurses', 'ncursesw', 'ncurses', 'curses']
         for pkg in pkgconfig_files:
-            candidates.append(functools.partial(PkgConfigDependency, pkg, env, kwargs))
+            candidates.append(DependencyCandidate.from_dependency(
+                pkg, PkgConfigDependency, (env, kwargs)))
 
     # There are path handling problems with these methods on msys, and they
     # don't apply to windows otherwise (cygwin is handled separately from
     # windows)
     if not env.machines[for_machine].is_windows():
         if DependencyMethods.CONFIG_TOOL in methods:
-            candidates.append(functools.partial(CursesConfigToolDependency, 'curses', env, kwargs))
+            candidates.append(DependencyCandidate.from_dependency(
+                'curses', CursesConfigToolDependency, (env, kwargs)))
 
         if DependencyMethods.SYSTEM in methods:
-            candidates.append(functools.partial(CursesSystemDependency, 'curses', env, kwargs))
+            candidates.append(DependencyCandidate.from_dependency(
+                'curses', CursesSystemDependency, (env, kwargs)))
 
     return candidates
 packages['curses'] = curses_factory
@@ -555,7 +558,6 @@ packages['curses'] = curses_factory
 
 @factory_methods({DependencyMethods.PKGCONFIG, DependencyMethods.SYSTEM})
 def shaderc_factory(env: 'Environment',
-                    for_machine: 'mesonlib.MachineChoice',
                     kwargs: DependencyObjectKWs,
                     methods: T.List[DependencyMethods]) -> T.List['DependencyGenerator']:
     """Custom DependencyFactory for ShaderC.
@@ -578,15 +580,16 @@ def shaderc_factory(env: 'Environment',
     if static is None:
         static = T.cast('bool', env.coredata.optstore.get_value_for(OptionKey('prefer_static')))
     if static:
-        c = [functools.partial(PkgConfigDependency, name, env, kwargs)
+        c = [DependencyCandidate.from_dependency(name, PkgConfigDependency, (env, kwargs))
              for name in static_libs + shared_libs]
     else:
-        c = [functools.partial(PkgConfigDependency, name, env, kwargs)
+        c = [DependencyCandidate.from_dependency(name, PkgConfigDependency, (env, kwargs))
             for name in shared_libs + static_libs]
     candidates.extend(c)
 
     if DependencyMethods.SYSTEM in methods:
-        candidates.append(functools.partial(ShadercDependency, env, kwargs))
+        candidates.append(DependencyCandidate.from_dependency(
+            'shaderc', ShadercDependency, (env, kwargs)))
 
     return candidates
 packages['shaderc'] = shaderc_factory
@@ -595,89 +598,89 @@ packages['shaderc'] = shaderc_factory
 
 packages['atomic'] = atomic_factory = DependencyFactory(
     'atomic',
     [DependencyMethods.SYSTEM, DependencyMethods.BUILTIN],
-    system_class=AtomicSystemDependency,
-    builtin_class=AtomicBuiltinDependency,
+    system=AtomicSystemDependency,
+    builtin=AtomicBuiltinDependency,
 )
 
 packages['cups'] = cups_factory = DependencyFactory(
     'cups',
     [DependencyMethods.PKGCONFIG, DependencyMethods.CONFIG_TOOL, DependencyMethods.EXTRAFRAMEWORK, DependencyMethods.CMAKE],
-    configtool_class=CupsDependencyConfigTool,
-    cmake_name='Cups',
+    configtool=CupsDependencyConfigTool,
+    cmake=DependencyCandidate.from_dependency('Cups', CMakeDependency),
 )
 
 packages['dl'] = dl_factory = DependencyFactory(
     'dl',
     [DependencyMethods.BUILTIN, DependencyMethods.SYSTEM],
-    builtin_class=DlBuiltinDependency,
-    system_class=DlSystemDependency,
+    builtin=DlBuiltinDependency,
+    system=DlSystemDependency,
 )
 
 packages['gpgme'] = gpgme_factory = DependencyFactory(
     'gpgme',
     [DependencyMethods.PKGCONFIG, DependencyMethods.CONFIG_TOOL],
-    configtool_class=GpgmeDependencyConfigTool,
+    configtool=GpgmeDependencyConfigTool,
 )
 
 packages['libgcrypt'] = libgcrypt_factory = DependencyFactory(
     'libgcrypt',
     [DependencyMethods.PKGCONFIG, DependencyMethods.CONFIG_TOOL],
-    configtool_class=LibGCryptDependencyConfigTool,
+    configtool=LibGCryptDependencyConfigTool,
 )
 
 packages['libwmf'] = libwmf_factory = DependencyFactory(
     'libwmf',
     [DependencyMethods.PKGCONFIG, DependencyMethods.CONFIG_TOOL],
-    configtool_class=LibWmfDependencyConfigTool,
+    configtool=LibWmfDependencyConfigTool,
 )
 
 packages['pcap'] = pcap_factory = DependencyFactory(
     'pcap',
     [DependencyMethods.PKGCONFIG, DependencyMethods.CONFIG_TOOL],
-    configtool_class=PcapDependencyConfigTool,
-    pkgconfig_name='libpcap',
+    configtool=PcapDependencyConfigTool,
+    pkgconfig=DependencyCandidate.from_dependency('libpcap', PkgConfigDependency),
 )
 
 packages['threads'] = threads_factory = DependencyFactory(
     'threads',
     [DependencyMethods.SYSTEM, DependencyMethods.CMAKE],
-    cmake_name='Threads',
-    system_class=ThreadDependency,
+    cmake=DependencyCandidate.from_dependency('Threads', CMakeDependency),
+    system=ThreadDependency,
 )
 
 packages['iconv'] = iconv_factory = DependencyFactory(
     'iconv',
     [DependencyMethods.BUILTIN, DependencyMethods.SYSTEM],
-    builtin_class=IconvBuiltinDependency,
-    system_class=IconvSystemDependency,
+    builtin=IconvBuiltinDependency,
+    system=IconvSystemDependency,
 )
 
 packages['intl'] = intl_factory = DependencyFactory(
     'intl',
     [DependencyMethods.BUILTIN, DependencyMethods.SYSTEM],
-    builtin_class=IntlBuiltinDependency,
-    system_class=IntlSystemDependency,
+    builtin=IntlBuiltinDependency,
+    system=IntlSystemDependency,
 )
 
 packages['openssl'] = openssl_factory = DependencyFactory(
     'openssl',
     [DependencyMethods.PKGCONFIG, DependencyMethods.SYSTEM, DependencyMethods.CMAKE],
-    system_class=OpensslSystemDependency,
-    cmake_class=CMakeDependencyFactory('OpenSSL', modules=['OpenSSL::Crypto', 'OpenSSL::SSL']),
+    system=OpensslSystemDependency,
+    cmake=DependencyCandidate.from_dependency('OpenSSL', CMakeDependency, modules=['OpenSSL::Crypto', 'OpenSSL::SSL']),
 )
 
 packages['libcrypto'] = libcrypto_factory = DependencyFactory(
     'libcrypto',
     [DependencyMethods.PKGCONFIG, DependencyMethods.SYSTEM, DependencyMethods.CMAKE],
-    system_class=OpensslSystemDependency,
-    cmake_class=CMakeDependencyFactory('OpenSSL', modules=['OpenSSL::Crypto']),
+    system=OpensslSystemDependency,
+    cmake=DependencyCandidate.from_dependency('OpenSSL', CMakeDependency, modules=['OpenSSL::Crypto']),
 )
 
 packages['libssl'] = libssl_factory = DependencyFactory(
     'libssl',
     [DependencyMethods.PKGCONFIG, DependencyMethods.SYSTEM, DependencyMethods.CMAKE],
-    system_class=OpensslSystemDependency,
-    cmake_class=CMakeDependencyFactory('OpenSSL', modules=['OpenSSL::SSL']),
+    system=OpensslSystemDependency,
+    cmake=DependencyCandidate.from_dependency('OpenSSL', CMakeDependency, modules=['OpenSSL::SSL']),
 )
 
 packages['objfw'] = ObjFWDependency
diff --git a/mesonbuild/dependencies/mpi.py b/mesonbuild/dependencies/mpi.py
index 8a0ac0a9c..3f61d0730 100644
--- a/mesonbuild/dependencies/mpi.py
+++ b/mesonbuild/dependencies/mpi.py
@@ -3,14 +3,13 @@
 
 from __future__ import annotations
 
-import functools
 import typing as T
 import os
 import re
 
 from ..envconfig import detect_cpu_family
 from ..mesonlib import Popen_safe
-from .base import DependencyException, DependencyMethods, detect_compiler, SystemDependency
+from .base import DependencyCandidate, DependencyException, DependencyMethods, detect_compiler, SystemDependency
 from .configtool import ConfigToolDependency
 from .detect import packages
 from .factory import factory_methods
@@ -19,22 +18,20 @@ from .pkgconfig import PkgConfigDependency
 
 if T.TYPE_CHECKING:
     from .factory import DependencyGenerator
     from ..environment import Environment
-    from ..mesonlib import MachineChoice
     from .base import DependencyObjectKWs
 
 
 @factory_methods({DependencyMethods.PKGCONFIG, DependencyMethods.CONFIG_TOOL, DependencyMethods.SYSTEM})
 def mpi_factory(env: 'Environment',
-                for_machine: 'MachineChoice',
                 kwargs: DependencyObjectKWs,
                 methods: T.List[DependencyMethods]) -> T.List['DependencyGenerator']:
-    language = kwargs.get('language')
-    if language is None:
-        language = 'c'
+    language = kwargs.get('language') or 'c'
     if language not in {'c', 'cpp', 'fortran'}:
         # OpenMPI doesn't work without any other languages
         return []
 
+    for_machine = kwargs['native']
+
     candidates: T.List['DependencyGenerator'] = []
     compiler = detect_compiler('mpi', env, for_machine, language)
     if not compiler:
@@ -77,14 +74,14 @@ def mpi_factory(env: 'Environment',
             tool_names.extend(['mpifort', 'mpif90', 'mpif77'])
 
         nwargs['tools'] = tool_names
-        candidates.append(functools.partial(
-            MPIConfigToolDependency, tool_names[0], env, nwargs, language=language))
+        candidates.append(DependencyCandidate.from_dependency(
+            tool_names[0], MPIConfigToolDependency, (env, nwargs)))
 
     if DependencyMethods.SYSTEM in methods and env.machines[for_machine].is_windows():
-        candidates.append(functools.partial(
-            MSMPIDependency, 'msmpi', env, kwargs, language=language))
-        candidates.append(functools.partial(
-            IMPIDependency, 'impi', env, kwargs, language=language))
+        candidates.append(DependencyCandidate.from_dependency(
+            'msmpi', MSMPIDependency, (env, kwargs)))
+        candidates.append(DependencyCandidate.from_dependency(
+            'impi', IMPIDependency, (env, kwargs)))
 
     # Only OpenMPI has pkg-config, and it doesn't work with the intel compilers
     # for MPI, environment variables and commands like mpicc should have priority
@@ -96,8 +93,8 @@ def mpi_factory(env: 'Environment',
             pkg_name = 'ompi-cxx'
         elif language == 'fortran':
             pkg_name = 'ompi-fort'
-        candidates.append(functools.partial(
-            PkgConfigDependency, pkg_name, env, kwargs, language=language))
+        candidates.append(DependencyCandidate.from_dependency(
+            pkg_name, PkgConfigDependency, (env, kwargs)))
 
     return candidates
@@ -107,9 +104,8 @@ packages['mpi'] = mpi_factory
 
 class MPIConfigToolDependency(ConfigToolDependency):
 
     """Wrapper around mpicc, Intel's mpiicc and friends."""
 
-    def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs,
-                 language: T.Optional[str] = None):
-        super().__init__(name, env, kwargs, language=language)
+    def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs):
+        super().__init__(name, env, kwargs)
         if not self.is_found:
             return
@@ -217,11 +213,10 @@ class MSMPIDependency(SystemDependency):
 
     """The Microsoft MPI."""
 
-    def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs,
-                 language: T.Optional[str] = None):
-        super().__init__(name, env, kwargs, language=language)
+    def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs):
+        super().__init__(name, env, kwargs)
         # MSMPI only supports the C API
-        if language not in {'c', 'fortran', None}:
+        if self.language not in {'c', 'fortran', None}:
             self.is_found = False
             return
         # MSMPI is only for windows, obviously
@@ -253,9 +248,8 @@ class IMPIDependency(SystemDependency):
 
     """Intel(R) MPI for Windows."""
 
-    def __init__(self, name: str, env: Environment, kwargs: DependencyObjectKWs,
-                 language: T.Optional[str] = None):
-        super().__init__(name, env, kwargs, language=language)
+    def __init__(self, name: str, env: Environment, kwargs: DependencyObjectKWs):
+        super().__init__(name, env, kwargs)
         # only for windows
         if not self.env.machines[self.for_machine].is_windows():
             return
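Review note: the recurring change in the hunks above replaces opaque `functools.partial` callables with `DependencyCandidate.from_dependency(...)`, made possible by normalizing every dependency constructor to the same `(name, environment, kwargs)` signature. The toy sketch below (illustrative field names and classes, not Meson's exact API) shows the payoff: a candidate carries its name and type alongside the factory, so "tried X" logging no longer needs per-class `log_tried()` staticmethods.

```python
# Illustrative sketch, not Meson's real implementation.
import functools
from dataclasses import dataclass
from typing import Callable

@dataclass
class DependencyCandidate:
    factory: Callable          # deferred constructor, like functools.partial
    name: str                  # dependency name, available before construction
    type_name: str             # lookup method, usable in "tried X" log lines
    arguments: tuple = ()

    @classmethod
    def from_dependency(cls, name, dep_class, arguments=()):
        # The uniform (name, environment, kwargs) signature lets us bind the
        # name here and take the display type from the class itself.
        return cls(functools.partial(dep_class, name),
                   name, getattr(dep_class, 'type_name', dep_class.__name__),
                   arguments)

    def __call__(self):
        return self.factory(*self.arguments)

class PkgConfigDependency:
    type_name = 'pkgconfig'
    def __init__(self, name, env, kwargs):
        self.name = name

cand = DependencyCandidate.from_dependency('ncursesw', PkgConfigDependency, ({}, {}))
print(cand.type_name)   # -> pkgconfig
dep = cand()            # instantiation is still deferred until called
print(dep.name)         # -> ncursesw
```

With a bare `functools.partial`, the name and lookup method are invisible until the dependency object exists; the wrapper makes both available to the logging and caching layers up front.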
diff --git a/mesonbuild/dependencies/pkgconfig.py b/mesonbuild/dependencies/pkgconfig.py
index b628e005b..2a4e91d8d 100644
--- a/mesonbuild/dependencies/pkgconfig.py
+++ b/mesonbuild/dependencies/pkgconfig.py
@@ -306,11 +306,11 @@ class PkgConfigCLI(PkgConfigInterface):
 
 class PkgConfigDependency(ExternalDependency):
+
+    type_name = DependencyTypeName('pkgconfig')
+
     def __init__(self, name: str, environment: Environment, kwargs: DependencyObjectKWs,
-                 language: T.Optional[str] = None, extra_paths: T.Optional[T.List[str]] = None) -> None:
-        super().__init__(DependencyTypeName('pkgconfig'), environment, kwargs, language=language)
-        self.name = name
+                 extra_paths: T.Optional[T.List[str]] = None) -> None:
+        super().__init__(name, environment, kwargs)
         self.is_libtool = False
         self.extra_paths = extra_paths or []
         pkgconfig = PkgConfigInterface.instance(self.env, self.for_machine, self.silent, self.extra_paths)
@@ -585,10 +585,6 @@ class PkgConfigDependency(ExternalDependency):
             # a path rather than the raw dlname
             return os.path.basename(dlname)
 
-    @staticmethod
-    def log_tried() -> str:
-        return 'pkgconfig'
-
     def get_variable(self, *, cmake: T.Optional[str] = None, pkgconfig: T.Optional[str] = None,
                      configtool: T.Optional[str] = None, internal: T.Optional[str] = None,
                      system: T.Optional[str] = None, default_value: T.Optional[str] = None,
diff --git a/mesonbuild/dependencies/platform.py b/mesonbuild/dependencies/platform.py
index 49ec980b2..829220ccb 100644
--- a/mesonbuild/dependencies/platform.py
+++ b/mesonbuild/dependencies/platform.py
@@ -15,8 +15,11 @@ if T.TYPE_CHECKING:
     from .base import DependencyObjectKWs
 
 class AppleFrameworks(ExternalDependency):
-    def __init__(self, env: 'Environment', kwargs: DependencyObjectKWs) -> None:
-        super().__init__(DependencyTypeName('appleframeworks'), env, kwargs)
+
+    type_name = DependencyTypeName('appleframeworks')
+
+    def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs) -> None:
+        super().__init__(name, env, kwargs)
         modules = kwargs.get('modules', [])
         if not modules:
             raise DependencyException("AppleFrameworks dependency requires at least one module.")
@@ -44,8 +47,4 @@ class AppleFrameworks(ExternalDependency):
     def log_info(self) -> str:
         return ', '.join(self.frameworks)
 
-    @staticmethod
-    def log_tried() -> str:
-        return 'framework'
-
 packages['appleframeworks'] = AppleFrameworks
diff --git a/mesonbuild/dependencies/python.py b/mesonbuild/dependencies/python.py
index aa2e22c6f..0e40d481f 100644
--- a/mesonbuild/dependencies/python.py
+++ b/mesonbuild/dependencies/python.py
@@ -8,13 +8,14 @@ from pathlib import Path
 import typing as T
 
 from .. import mesonlib, mlog
-from .base import process_method_kw, DependencyException, DependencyMethods, ExternalDependency, SystemDependency
+from .base import process_method_kw, DependencyCandidate, DependencyException, DependencyMethods, ExternalDependency, SystemDependency
 from .configtool import ConfigToolDependency
 from .detect import packages
 from .factory import DependencyFactory
 from .framework import ExtraFrameworkDependency
 from .pkgconfig import PkgConfigDependency
 from ..envconfig import detect_cpu_family
+from ..mesonlib import MachineChoice
 from ..programs import ExternalProgram
 from ..options import OptionKey
@@ -23,7 +24,6 @@ if T.TYPE_CHECKING:
 
     from .factory import DependencyGenerator
     from ..environment import Environment
-    from ..mesonlib import MachineChoice
     from .base import DependencyObjectKWs
 
     class PythonIntrospectionDict(TypedDict):
@@ -248,9 +248,9 @@ class BasicPythonExternalProgram(ExternalProgram):
 
 class _PythonDependencyBase(_Base):
 
-    def __init__(self, python_holder: 'BasicPythonExternalProgram', embed: bool,
-                 for_machine: 'MachineChoice'):
-        self.for_machine = for_machine
+    for_machine: MachineChoice
+
+    def __init__(self, python_holder: 'BasicPythonExternalProgram', embed: bool):
         self.embed = embed
         self.build_config = python_holder.build_config
@@ -457,9 +457,10 @@ class _PythonDependencyBase(_Base):
 
 class PythonPkgConfigDependency(PkgConfigDependency, _PythonDependencyBase):
 
-    def __init__(self, environment: 'Environment', kwargs: DependencyObjectKWs,
-                 installation: 'BasicPythonExternalProgram', embed: bool,
-                 for_machine: 'MachineChoice'):
+    # name is needed for polymorphism
+    def __init__(self, name: str, environment: Environment, kwargs: DependencyObjectKWs,
+                 installation: 'BasicPythonExternalProgram'):
+        embed = kwargs.get('embed', False)
         pkg_embed = '-embed' if embed and mesonlib.version_compare(installation.info['version'], '>=3.8') else ''
         pkg_name = f'python-{installation.version}{pkg_embed}'
@@ -477,6 +478,7 @@ class PythonPkgConfigDependency(PkgConfigDependency, _PythonDependencyBase):
             self.is_found = False
             return
 
+        for_machine = kwargs['native']
         sysroot = environment.properties[for_machine].get_sys_root() or ''
         pkg_libdir = sysroot + pkg_libdir
@@ -484,7 +486,7 @@ class PythonPkgConfigDependency(PkgConfigDependency, _PythonDependencyBase):
         pkgconfig_paths = [pkg_libdir] if pkg_libdir else []
         PkgConfigDependency.__init__(self, pkg_name, environment, kwargs, extra_paths=pkgconfig_paths)
-        _PythonDependencyBase.__init__(self, installation, kwargs.get('embed', False), for_machine)
+        _PythonDependencyBase.__init__(self, installation, embed)
 
         if pkg_libdir and not self.is_found:
             mlog.debug(f'{pkg_name!r} could not be found in {pkg_libdir_origin}, '
@@ -509,19 +511,17 @@ class PythonPkgConfigDependency(PkgConfigDependency, _PythonDependencyBase):
 
 class PythonFrameworkDependency(ExtraFrameworkDependency, _PythonDependencyBase):
 
     def __init__(self, name: str, environment: 'Environment',
-                 kwargs: DependencyObjectKWs, installation: 'BasicPythonExternalProgram',
-                 for_machine: 'MachineChoice'):
+                 kwargs: DependencyObjectKWs, installation: 'BasicPythonExternalProgram'):
         ExtraFrameworkDependency.__init__(self, name, environment, kwargs)
-        _PythonDependencyBase.__init__(self, installation, kwargs.get('embed', False), for_machine)
+        _PythonDependencyBase.__init__(self, installation, kwargs.get('embed', False))
 
 
 class PythonSystemDependency(SystemDependency, _PythonDependencyBase):
 
     def __init__(self, name: str, environment: 'Environment',
-                 kwargs: DependencyObjectKWs, installation: 'BasicPythonExternalProgram',
-                 for_machine: 'MachineChoice'):
+                 kwargs: DependencyObjectKWs, installation: BasicPythonExternalProgram):
         SystemDependency.__init__(self, name, environment, kwargs)
-        _PythonDependencyBase.__init__(self, installation, kwargs.get('embed', False), for_machine)
+        _PythonDependencyBase.__init__(self, installation, kwargs.get('embed', False))
 
         # For most platforms, match pkg-config behavior. iOS is a special case;
         # check for that first, so that check takes priority over
@@ -541,7 +541,7 @@ class PythonSystemDependency(SystemDependency, _PythonDependencyBase):
 
         # compile args
         if self.build_config:
-            sysroot = environment.properties[for_machine].get_sys_root() or ''
+            sysroot = environment.properties[self.for_machine].get_sys_root() or ''
             inc_paths = mesonlib.OrderedSet([sysroot + self.build_config['c_api']['headers']])
         else:
             inc_paths = mesonlib.OrderedSet([
@@ -559,17 +559,11 @@ class PythonSystemDependency(SystemDependency, _PythonDependencyBase):
         if not self.clib_compiler.has_header('Python.h', '', extra_args=self.compile_args)[0]:
             self.is_found = False
 
-    @staticmethod
-    def log_tried() -> str:
-        return 'sysconfig'
-
 
-def python_factory(env: 'Environment', for_machine: 'MachineChoice',
-                   kwargs: DependencyObjectKWs,
+def python_factory(env: Environment, kwargs: DependencyObjectKWs,
                    installation: T.Optional['BasicPythonExternalProgram'] = None) -> T.List['DependencyGenerator']:
     # We can't use the factory_methods decorator here, as we need to pass the
     # extra installation argument
     methods = process_method_kw({DependencyMethods.PKGCONFIG, DependencyMethods.SYSTEM}, kwargs)
-    embed = kwargs.get('embed', False)
     candidates: T.List['DependencyGenerator'] = []
     from_installation = installation is not None
     # When not invoked through the python module, default installation.
@@ -579,12 +573,18 @@ def python_factory(env: 'Environment', for_machine: 'MachineChoice',
 
     if DependencyMethods.PKGCONFIG in methods:
         if from_installation:
-            candidates.append(functools.partial(PythonPkgConfigDependency, env, kwargs, installation, embed, for_machine))
+            candidates.append(DependencyCandidate(
+                functools.partial(PythonPkgConfigDependency, installation=installation),
+                'python3', PythonPkgConfigDependency.type_name, arguments=(env, kwargs)))
         else:
-            candidates.append(functools.partial(PkgConfigDependency, 'python3', env, kwargs))
+            candidates.append(DependencyCandidate.from_dependency(
+                'python3', PkgConfigDependency, (env, kwargs)))
 
     if DependencyMethods.SYSTEM in methods:
-        candidates.append(functools.partial(PythonSystemDependency, 'python', env, kwargs, installation, for_machine))
+        # This is a unique log-tried.
+        candidates.append(DependencyCandidate(
+            functools.partial(PythonSystemDependency, installation=installation),
+            'python', 'sysconfig', arguments=(env, kwargs)))
 
     if DependencyMethods.EXTRAFRAMEWORK in methods:
         nkwargs = kwargs.copy()
@@ -592,7 +592,9 @@ def python_factory(env: 'Environment', for_machine: 'MachineChoice',
             # There is a python in /System/Library/Frameworks, but that's python 2.x,
             # Python 3 will always be in /Library
             nkwargs['paths'] = ['/Library/Frameworks']
-        candidates.append(functools.partial(PythonFrameworkDependency, 'Python', env, nkwargs, installation, for_machine))
+        candidates.append(DependencyCandidate(
+            functools.partial(PythonFrameworkDependency, installation=installation),
+            'python', PythonPkgConfigDependency.type_name, arguments=(env, nkwargs)))
 
     return candidates
@@ -601,11 +603,11 @@ packages['python3'] = python_factory
 
 packages['pybind11'] = pybind11_factory = DependencyFactory(
     'pybind11',
     [DependencyMethods.PKGCONFIG, DependencyMethods.CONFIG_TOOL, DependencyMethods.CMAKE],
-    configtool_class=Pybind11ConfigToolDependency,
+    configtool=Pybind11ConfigToolDependency,
 )
 
 packages['numpy'] = numpy_factory = DependencyFactory(
     'numpy',
     [DependencyMethods.PKGCONFIG, DependencyMethods.CONFIG_TOOL],
-    configtool_class=NumPyConfigToolDependency,
+    configtool=NumPyConfigToolDependency,
 )
diff --git a/mesonbuild/dependencies/qt.py b/mesonbuild/dependencies/qt.py
index c245e5c8c..57cdd5ec6 100644
--- a/mesonbuild/dependencies/qt.py
+++ b/mesonbuild/dependencies/qt.py
@@ -97,8 +97,8 @@ def _get_modules_lib_suffix(version: str, info: 'MachineInfo', is_debug: bool) -
 
 class QtExtraFrameworkDependency(ExtraFrameworkDependency):
-    def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs, qvars: T.Dict[str, str], language: T.Optional[str] = None):
-        super().__init__(name, env, kwargs, language=language)
+    def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs, qvars: T.Dict[str, str]):
+        super().__init__(name, env, kwargs)
         self.mod_name = name[2:]
         self.qt_extra_include_directory = qvars['QT_INSTALL_HEADERS']
@@ -185,7 +185,7 @@ class QtPkgConfigDependency(_QtBase, PkgConfigDependency, metaclass=abc.ABCMeta)
         self.link_args = []
         for m in self.requested_modules:
-            mod = PkgConfigDependency(self.qtpkgname + m, self.env, kwargs, language=self.language)
+            mod = PkgConfigDependency(self.qtpkgname + m, self.env, kwargs)
             if not mod.found():
                 self.is_found = False
                 return
@@ -357,11 +357,12 @@ class QmakeQtDependency(_QtBase, ConfigToolDependency, metaclass=abc.ABCMeta):
             fw_kwargs = kwargs.copy()
             fw_kwargs.pop('method')
             fw_kwargs['paths'] = [libdir]
+            fw_kwargs['language'] = self.language
 
             for m in modules:
                 fname = 'Qt' + m
                 mlog.debug('Looking for qt framework ' + fname)
-                fwdep = QtExtraFrameworkDependency(fname, self.env, fw_kwargs, qvars, language=self.language)
+                fwdep = QtExtraFrameworkDependency(fname, self.env, fw_kwargs, qvars)
                 if fwdep.found():
                     self.compile_args.append('-F' + libdir)
                     self.compile_args += fwdep.get_compile_args(with_private_headers=self.private_headers,
@@ -467,20 +468,20 @@ class Qt6PkgConfigDependency(Qt6WinMainMixin, QtPkgConfigDependency):
 
 packages['qt4'] = qt4_factory = DependencyFactory(
     'qt4',
     [DependencyMethods.PKGCONFIG, DependencyMethods.CONFIG_TOOL],
-    pkgconfig_class=Qt4PkgConfigDependency,
-    configtool_class=Qt4ConfigToolDependency,
+    pkgconfig=Qt4PkgConfigDependency,
+    configtool=Qt4ConfigToolDependency,
 )
 
 packages['qt5'] = qt5_factory = DependencyFactory(
     'qt5',
     [DependencyMethods.PKGCONFIG, DependencyMethods.CONFIG_TOOL],
-    pkgconfig_class=Qt5PkgConfigDependency,
-    configtool_class=Qt5ConfigToolDependency,
+    pkgconfig=Qt5PkgConfigDependency,
+    configtool=Qt5ConfigToolDependency,
 )
 
 packages['qt6'] = qt6_factory = DependencyFactory(
     'qt6',
     [DependencyMethods.PKGCONFIG, DependencyMethods.CONFIG_TOOL],
-    pkgconfig_class=Qt6PkgConfigDependency,
-    configtool_class=Qt6ConfigToolDependency,
+    pkgconfig=Qt6PkgConfigDependency,
+    configtool=Qt6ConfigToolDependency,
 )
diff --git a/mesonbuild/dependencies/scalapack.py b/mesonbuild/dependencies/scalapack.py
index 26a6e3904..92276ce68 100644
--- a/mesonbuild/dependencies/scalapack.py
+++ b/mesonbuild/dependencies/scalapack.py
@@ -4,12 +4,11 @@
 from __future__ import annotations
 
 from pathlib import Path
-import functools
 import os
 import typing as T
 
 from ..options import OptionKey
-from .base import DependencyException, DependencyMethods
+from .base import DependencyCandidate, DependencyException, DependencyMethods
 from .cmake import CMakeDependency
 from .detect import packages
 from .pkgconfig import PkgConfigDependency
@@ -17,13 +16,12 @@ from .factory import factory_methods
 
 if T.TYPE_CHECKING:
     from ..environment import Environment
-    from ..mesonlib import MachineChoice
     from .factory import DependencyGenerator
     from .base import DependencyObjectKWs
 
 
 @factory_methods({DependencyMethods.PKGCONFIG, DependencyMethods.CMAKE})
-def scalapack_factory(env: 'Environment', for_machine: 'MachineChoice',
+def scalapack_factory(env: 'Environment',
                       kwargs: DependencyObjectKWs,
                       methods: T.List[DependencyMethods]) -> T.List['DependencyGenerator']:
     candidates: T.List['DependencyGenerator'] = []
@@ -31,16 +29,16 @@ def scalapack_factory(env: 'Environment', for_machine: 'MachineChoice',
 
     if DependencyMethods.PKGCONFIG in methods:
         static_opt = kwargs['static'] if kwargs.get('static') is not None else env.coredata.optstore.get_value_for(OptionKey('prefer_static'))
         mkl = 'mkl-static-lp64-iomp' if static_opt else 'mkl-dynamic-lp64-iomp'
-        candidates.append(functools.partial(
-            MKLPkgConfigDependency, mkl, env, kwargs))
+        candidates.append(DependencyCandidate.from_dependency(
+            mkl, MKLPkgConfigDependency, (env, kwargs)))
 
         for pkg in ['scalapack-openmpi', 'scalapack']:
-            candidates.append(functools.partial(
-                PkgConfigDependency, pkg, env, kwargs))
+            candidates.append(DependencyCandidate.from_dependency(
+                pkg, PkgConfigDependency, (env, kwargs)))
 
     if DependencyMethods.CMAKE in methods:
-        candidates.append(functools.partial(
-            CMakeDependency, 'Scalapack', env, kwargs))
+        candidates.append(DependencyCandidate.from_dependency(
+            'Scalapack', CMakeDependency, (env, kwargs)))
 
     return candidates
@@ -55,15 +53,14 @@ class MKLPkgConfigDependency(PkgConfigDependency):
     bunch of fixups to make it work correctly.
     """
 
-    def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs,
-                 language: T.Optional[str] = None):
+    def __init__(self, name: str, env: 'Environment', kwargs: DependencyObjectKWs):
         _m = os.environ.get('MKLROOT')
         self.__mklroot = Path(_m).resolve() if _m else None
 
         # We need to call down into the normal super() method even if we don't
         # find mklroot, otherwise we won't have all of the instance variables
         # initialized that meson expects.
-        super().__init__(name, env, kwargs, language=language)
+        super().__init__(name, env, kwargs)
 
         # Doesn't work with gcc on windows, but does on Linux
         if env.machines[self.for_machine].is_windows() and self.clib_compiler.id == 'gcc':
diff --git a/mesonbuild/dependencies/ui.py b/mesonbuild/dependencies/ui.py
index 566ba52dc..0a1901b46 100644
--- a/mesonbuild/dependencies/ui.py
+++ b/mesonbuild/dependencies/ui.py
@@ -16,7 +16,8 @@ from ..mesonlib import (
     Popen_safe, version_compare_many
 )
-from .base import DependencyException, DependencyMethods, DependencyTypeName, SystemDependency
+from .base import DependencyCandidate, DependencyException, DependencyMethods, DependencyTypeName, SystemDependency
+from .cmake import CMakeDependency
 from .configtool import ConfigToolDependency
 from .detect import packages
 from .factory import DependencyFactory
@@ -56,8 +57,9 @@ class GnuStepDependency(ConfigToolDependency):
     tools = ['gnustep-config']
     tool_name = 'gnustep-config'
 
-    def __init__(self, environment: 'Environment', kwargs: DependencyObjectKWs) -> None:
-        super().__init__('gnustep', environment, kwargs, language='objc')
+    def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs) -> None:
+        kwargs['language'] = 'objc'
+        super().__init__(name, environment, kwargs)
         if not self.is_found:
             return
         self.modules = kwargs.get('modules', [])
@@ -148,8 +150,11 @@ class WxDependency(ConfigToolDependency):
     tools = ['wx-config-3.0', 'wx-config-3.1', 'wx-config', 'wx-config-gtk3']
     tool_name = 'wx-config'
 
-    def __init__(self, environment: 'Environment', kwargs: DependencyObjectKWs):
-        super().__init__('WxWidgets', environment, kwargs, language='cpp')
+    # name is intentionally ignored to maintain existing capitalization,
+    # but is needed for polymorphism
+    def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs):
+        kwargs['language'] = 'cpp'
+        super().__init__('WxWidgets', environment, kwargs)
         if not self.is_found:
             return
         self.requested_modules = kwargs.get('modules', [])
@@ -174,8 +179,8 @@ packages['wxwidgets'] = WxDependency
 
 class VulkanDependencySystem(SystemDependency):
 
-    def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs, language: T.Optional[str] = None) -> None:
-        super().__init__(name, environment, kwargs, language=language)
+    def __init__(self, name: str, environment: 'Environment', kwargs: DependencyObjectKWs) -> None:
+        super().__init__(name, environment, kwargs)
 
         self.vulkan_sdk = os.environ.get('VULKAN_SDK', os.environ.get('VK_SDK_PATH'))
         if self.vulkan_sdk and not os.path.isabs(self.vulkan_sdk):
@@ -270,18 +275,18 @@ class VulkanDependencySystem(SystemDependency):
 
 packages['gl'] = gl_factory = DependencyFactory(
     'gl',
     [DependencyMethods.PKGCONFIG, DependencyMethods.SYSTEM],
-    system_class=GLDependencySystem,
+    system=GLDependencySystem,
 )
 
 packages['sdl2'] = sdl2_factory = DependencyFactory(
     'sdl2',
     [DependencyMethods.PKGCONFIG, DependencyMethods.CONFIG_TOOL, DependencyMethods.EXTRAFRAMEWORK, DependencyMethods.CMAKE],
-    configtool_class=SDL2DependencyConfigTool,
-    cmake_name='SDL2',
+    configtool=SDL2DependencyConfigTool,
+    cmake=DependencyCandidate.from_dependency('SDL2', CMakeDependency),
 )
 
 packages['vulkan'] = vulkan_factory = DependencyFactory(
     'vulkan',
     [DependencyMethods.PKGCONFIG, DependencyMethods.SYSTEM],
-    system_class=VulkanDependencySystem,
+    system=VulkanDependencySystem,
 )
diff --git a/mesonbuild/interpreter/compiler.py b/mesonbuild/interpreter/compiler.py
index d5b14afbb..e79a20fed 100644
--- a/mesonbuild/interpreter/compiler.py
+++ b/mesonbuild/interpreter/compiler.py
@@ -47,7 +47,7 @@ if T.TYPE_CHECKING:
 
     class BaseCompileKW(TypedDict):
         no_builtin_args: bool
-        include_directories: T.List[build.IncludeDirs]
+        include_directories: T.List[T.Union[str, build.IncludeDirs]]
         args: T.List[str]
 
     class CompileKW(BaseCompileKW, ExtractRequired):
@@ -165,14 +165,23 @@ _NO_BUILTIN_ARGS_KW = KwargInfo('no_builtin_args', bool, default=False)
 _NAME_KW = KwargInfo('name', str, default='')
 _WERROR_KW = KwargInfo('werror', bool, default=False, since='1.3.0')
 
+_INCLUDE_DIRECTORIES_KW = INCLUDE_DIRECTORIES.evolve(
+    since_values={ContainerTypeInfo(list, str): '1.10.0'}
+)
+
 # Many of the compiler methods take this kwarg signature exactly, this allows
 # simplifying the `typed_kwargs` calls
-_COMMON_KWS: T.List[KwargInfo] = [_ARGS_KW, _DEPENDENCIES_KW, INCLUDE_DIRECTORIES, _PREFIX_KW, _NO_BUILTIN_ARGS_KW]
+_COMMON_KWS: T.List[KwargInfo] = [
+    _ARGS_KW, _DEPENDENCIES_KW, _INCLUDE_DIRECTORIES_KW, _PREFIX_KW,
+    _NO_BUILTIN_ARGS_KW,
+]
 
 # Common methods of compiles, links, runs, and similar
-_COMPILES_KWS: T.List[KwargInfo] = [_NAME_KW, _ARGS_KW, _DEPENDENCIES_KW, INCLUDE_DIRECTORIES, _NO_BUILTIN_ARGS_KW,
-                                    _WERROR_KW,
-                                    REQUIRED_KW.evolve(since='1.5.0', default=False)]
+_COMPILES_KWS: T.List[KwargInfo] = [
+    _NAME_KW, _ARGS_KW, _DEPENDENCIES_KW, _INCLUDE_DIRECTORIES_KW,
+    _NO_BUILTIN_ARGS_KW, _WERROR_KW,
+    REQUIRED_KW.evolve(since='1.5.0', default=False),
+]
 
 _HEADER_KWS: T.List[KwargInfo] = [REQUIRED_KW.evolve(since='0.50.0', default=False), *_COMMON_KWS]
 _HAS_REQUIRED_KW = REQUIRED_KW.evolve(since='1.3.0', default=False)
@@ -226,7 +235,7 @@ class CompilerHolder(ObjectHolder['Compiler']):
     def _determine_args(self, kwargs: BaseCompileKW, mode: CompileCheckMode = CompileCheckMode.LINK) -> T.List[str]:
         args: T.List[str] = []
-        for i in self.interpreter.extract_incdirs(kwargs, strings_since='1.10.0'):
+        for i in self.interpreter.extract_incdirs(kwargs['include_directories']):
             for idir in i.to_string_list(self.environment.get_source_dir(), self.environment.get_build_dir()):
                 args.extend(self.compiler.get_include_args(idir, False))
         if not kwargs['no_builtin_args']:
@@ -649,6 +658,7 @@ class CompilerHolder(ObjectHolder['Compiler']):
         lib = dependencies.ExternalLibrary(libname, None,
                                            self.environment,
                                            self.compiler.language,
+                                           self.held_object.for_machine,
                                            silent=True)
         return lib
@@ -673,15 +683,13 @@ class CompilerHolder(ObjectHolder['Compiler']):
             mlog.log('Library', mlog.bold(libname), 'skipped: feature', mlog.bold(feature), 'disabled')
             return self.notfound_library(libname)
 
-        include_directories = self.interpreter.extract_incdirs(kwargs, key='header_include_directories', strings_since='1.10.0')
-
         # This could be done with a comprehension, but that confuses the type
         # checker, and having it check this seems valuable
         has_header_kwargs: 'HeaderKW' = {
             'required': required,
             'args': kwargs['header_args'],
             'dependencies': kwargs['header_dependencies'],
-            'include_directories': include_directories,
+            'include_directories': kwargs['header_include_directories'],
             'prefix': kwargs['header_prefix'],
             'no_builtin_args': kwargs['header_no_builtin_args'],
         }
@@ -710,7 +718,7 @@ class CompilerHolder(ObjectHolder['Compiler']):
                      .format(self.compiler.get_display_language(), libtype_s, libname))
         lib = dependencies.ExternalLibrary(libname, linkargs, self.environment,
-                                           self.compiler.language)
+                                           self.compiler.language, self.held_object.for_machine)
         return lib
 
     def _has_argument_impl(self, arguments: T.Union[str, T.List[str]],
@@ -878,7 +886,7 @@ class CompilerHolder(ObjectHolder['Compiler']):
         'compiler.preprocess',
         KwargInfo('output', str, default='@PLAINNAME@.i'),
         KwargInfo('compile_args', ContainerTypeInfo(list, str), listify=True, default=[]),
-        INCLUDE_DIRECTORIES,
+        _INCLUDE_DIRECTORIES_KW,
         _DEPENDENCIES_KW.evolve(since='1.1.0'),
         _DEPENDS_KW.evolve(since='1.4.0'),
     )
@@ -906,7 +914,7 @@ class CompilerHolder(ObjectHolder['Compiler']):
             compiler,
             self.interpreter.backend,
             kwargs['compile_args'],
-            self.interpreter.extract_incdirs(kwargs, strings_since='1.10.0'),
+            self.interpreter.extract_incdirs(kwargs['include_directories']),
             kwargs['dependencies'],
             kwargs['depends'])
         self.interpreter.add_target(tg.name, tg)
diff --git a/mesonbuild/interpreter/dependencyfallbacks.py b/mesonbuild/interpreter/dependencyfallbacks.py
index baf5ea39e..4e4dcde56 100644
--- a/mesonbuild/interpreter/dependencyfallbacks.py
+++ b/mesonbuild/interpreter/dependencyfallbacks.py
@@ -8,7 +8,7 @@ from .. import mlog
 from .. import dependencies
 from .. import build
 from ..wrap import WrapMode
-from ..mesonlib import stringlistify, version_compare_many, MachineChoice
+from ..mesonlib import stringlistify, version_compare_many
 from ..options import OptionKey
 from ..dependencies import Dependency, DependencyException, NotFoundDependency
 from ..interpreterbase import (MesonInterpreterObject, FeatureNew,
@@ -96,7 +96,7 @@ class DependencyFallbacksHolder(MesonInterpreterObject):
         self._handle_featurenew_dependencies(name)
         dep = dependencies.find_external_dependency(name, self.environment, kwargs)
         if dep.found():
-            for_machine = kwargs.get('native', MachineChoice.HOST)
+            for_machine = kwargs['native']
             identifier = dependencies.get_dep_identifier(name, kwargs)
             self.coredata.deps[for_machine].put(identifier, dep)
         return dep
@@ -209,7 +209,7 @@ class DependencyFallbacksHolder(MesonInterpreterObject):
         # of None in the case the dependency is cached as not-found, or if cached
         # version does not match. In that case we don't want to continue with
         # other candidates.
-        for_machine = kwargs.get('native', MachineChoice.HOST)
+        for_machine = kwargs['native']
         identifier = dependencies.get_dep_identifier(name, kwargs)
         wanted_vers = stringlistify(kwargs.get('version', []))
 
@@ -366,7 +366,7 @@ class DependencyFallbacksHolder(MesonInterpreterObject):
         # Override this dependency to have consistent results in subsequent
         # dependency lookups.
for name in self.names: - for_machine = kwargs.get('native', MachineChoice.HOST) + for_machine = kwargs['native'] identifier = dependencies.get_dep_identifier(name, kwargs) if identifier not in self.build.dependency_overrides[for_machine]: self.build.dependency_overrides[for_machine][identifier] = \ diff --git a/mesonbuild/interpreter/interpreter.py b/mesonbuild/interpreter/interpreter.py index e3d321e9a..323daefed 100644 --- a/mesonbuild/interpreter/interpreter.py +++ b/mesonbuild/interpreter/interpreter.py @@ -673,7 +673,7 @@ class Interpreter(InterpreterBase, HoldableObject): D_MODULE_VERSIONS_KW.evolve(since='0.62.0'), KwargInfo('link_args', ContainerTypeInfo(list, str), listify=True, default=[]), DEPENDENCIES_KW, - INCLUDE_DIRECTORIES, + INCLUDE_DIRECTORIES.evolve(since_values={ContainerTypeInfo(list, str): '0.50.0'}), LINK_WITH_KW, LINK_WHOLE_KW.evolve(since='0.46.0'), DEPENDENCY_SOURCES_KW, @@ -685,7 +685,7 @@ class Interpreter(InterpreterBase, HoldableObject): def func_declare_dependency(self, node: mparser.BaseNode, args: T.List[TYPE_var], kwargs: kwtypes.FuncDeclareDependency) -> dependencies.Dependency: deps = kwargs['dependencies'] - incs = self.extract_incdirs(kwargs, strings_since='0.50.0') + incs = self.extract_incdirs(kwargs['include_directories']) libs = kwargs['link_with'] libs_whole = kwargs['link_whole'] objects = kwargs['objects'] @@ -698,7 +698,7 @@ class Interpreter(InterpreterBase, HoldableObject): if version is None: version = self.project_version d_module_versions = kwargs['d_module_versions'] - d_import_dirs = self.extract_incdirs(kwargs, 'd_import_dirs') + d_import_dirs = self.extract_incdirs(kwargs['d_import_dirs'], True) srcdir = self.environment.source_dir subproject_dir = os.path.abspath(os.path.join(srcdir, self.subproject_dir)) project_root = os.path.abspath(os.path.join(srcdir, self.root_subdir)) @@ -1060,7 +1060,11 @@ class Interpreter(InterpreterBase, HoldableObject): except cargo.TomlImplementationMissing as e: raise 
MesonException(f'Failed to load Cargo.lock: {e!s}') - ast = cargo_int.interpret(subdir) + if os.path.exists(os.path.join(self.environment.get_source_dir(), subdir, environment.build_filename)): + ast = None + else: + ast = cargo_int.interpret(subdir) + return self._do_subproject_meson( subp_name, subdir, default_options, kwargs, ast, relaxations={InterpreterRuleRelaxation.CARGO_SUBDIR}, @@ -2792,29 +2796,21 @@ class Interpreter(InterpreterBase, HoldableObject): install_tag=install_tag, data_type='configure')) return mesonlib.File.from_built_file(self.subdir, output) - def extract_incdirs(self, kwargs, key: str = 'include_directories', strings_since: T.Optional[str] = None) -> T.List[build.IncludeDirs]: - prospectives = extract_as_list(kwargs, key) - if strings_since: - for i in prospectives: - if isinstance(i, str): - FeatureNew.single_use(f'{key} kwarg of type string', strings_since, self.subproject, - f'Use include_directories({i!r}) instead', location=self.current_node) - break - + def extract_incdirs(self, prospectives: T.List[T.Union[str, build.IncludeDirs]], + is_d_import_dirs: bool = False + ) -> T.List[build.IncludeDirs]: result: T.List[build.IncludeDirs] = [] for p in prospectives: if isinstance(p, build.IncludeDirs): result.append(p) - elif isinstance(p, str): - if key == 'd_import_dirs' and os.path.normpath(p).startswith(self.environment.get_source_dir()): + else: + if is_d_import_dirs and os.path.normpath(p).startswith(self.environment.get_source_dir()): FeatureDeprecated.single_use('Building absolute path to source dir is not supported', '0.45', self.subproject, 'Use a relative path instead.', location=self.current_node) p = os.path.relpath(p, os.path.join(self.environment.get_source_dir(), self.subdir)) result.append(self.build_incdir_object([p])) - else: - raise InterpreterException('Include directory objects can only be created from strings or include directories.') return result @typed_pos_args('include_directories', varargs=str) @@ -3481,7 +3477,8 
@@ class Interpreter(InterpreterBase, HoldableObject): if targetclass is not build.Jar: self.check_for_jar_sources(sources, targetclass) - kwargs['d_import_dirs'] = self.extract_incdirs(kwargs, 'd_import_dirs') + kwargs['include_directories'] = self.extract_incdirs(kwargs['include_directories']) + kwargs['d_import_dirs'] = self.extract_incdirs(kwargs['d_import_dirs'], True) missing: T.List[str] = [] for each in itertools.chain(kwargs['c_pch'] or [], kwargs['cpp_pch'] or []): if each is not None: @@ -3525,8 +3522,6 @@ class Interpreter(InterpreterBase, HoldableObject): node=node) outputs.update(o) - kwargs['include_directories'] = self.extract_incdirs(kwargs, strings_since='0.50.0') - if targetclass is build.Executable: kwargs = T.cast('kwtypes.Executable', kwargs) if kwargs['gui_app'] is not None: @@ -3568,7 +3563,7 @@ class Interpreter(InterpreterBase, HoldableObject): for l in target.compilers.keys(): dep = self.build.stdlibs[target.for_machine].get(l, None) if dep: - target.add_deps(dep) + target.add_deps([dep]) def check_sources_exist(self, subdir, sources): for s in sources: diff --git a/mesonbuild/interpreter/kwargs.py b/mesonbuild/interpreter/kwargs.py index 1e206422f..6adad72e7 100644 --- a/mesonbuild/interpreter/kwargs.py +++ b/mesonbuild/interpreter/kwargs.py @@ -337,8 +337,10 @@ class _BaseBuildTarget(TypedDict): build_by_default: bool build_rpath: str + dependencies: T.List[Dependency] extra_files: T.List[FileOrString] gnu_symbol_visibility: str + include_directories: T.List[build.IncludeDirs] install: bool install_mode: FileMode install_tag: T.Optional[str] @@ -346,6 +348,8 @@ class _BaseBuildTarget(TypedDict): implicit_include_directories: bool link_depends: T.List[T.Union[str, File, build.GeneratedTypes]] link_language: T.Optional[str] + link_whole: T.List[build.StaticTargetTypes] + link_with: T.List[build.BuildTargetTypes] name_prefix: T.Optional[str] name_suffix: T.Optional[str] native: MachineChoice @@ -366,6 +370,7 @@ class 
_BuildTarget(_BaseBuildTarget): d_import_dirs: T.List[T.Union[str, build.IncludeDirs]] d_module_versions: T.List[T.Union[str, int]] d_unittest: bool + install_dir: T.List[T.Union[str, bool]] rust_crate_type: T.Optional[Literal['bin', 'lib', 'rlib', 'dylib', 'cdylib', 'staticlib', 'proc-macro']] rust_dependency_map: T.Dict[str, str] swift_interoperability_mode: Literal['c', 'cpp'] @@ -489,8 +494,8 @@ class FuncDeclareDependency(TypedDict): extra_files: T.List[FileOrString] include_directories: T.List[T.Union[build.IncludeDirs, str]] link_args: T.List[str] - link_whole: T.List[T.Union[build.StaticLibrary, build.CustomTarget, build.CustomTargetIndex]] - link_with: T.List[build.LibTypes] + link_whole: T.List[build.StaticTargetTypes] + link_with: T.List[build.BuildTargetTypes] objects: T.List[build.ExtractedObjects] sources: T.List[T.Union[FileOrString, build.GeneratedTypes]] variables: T.Dict[str, str] diff --git a/mesonbuild/interpreter/type_checking.py b/mesonbuild/interpreter/type_checking.py index 743046caf..3512925ee 100644 --- a/mesonbuild/interpreter/type_checking.py +++ b/mesonbuild/interpreter/type_checking.py @@ -426,6 +426,9 @@ DEPENDENCIES_KW: KwargInfo[T.List[Dependency]] = KwargInfo( ContainerTypeInfo(list, (Dependency, InternalDependency)), listify=True, default=[], + extra_types={ + BuildTarget: lambda arg: f'Tried to use a build_target "{T.cast("BuildTarget", arg).name}" as a dependency. 
This should be in `link_with` or `link_whole` instead.', + }, ) D_MODULE_VERSIONS_KW: KwargInfo[T.List[T.Union[str, int]]] = KwargInfo( @@ -435,31 +438,42 @@ D_MODULE_VERSIONS_KW: KwargInfo[T.List[T.Union[str, int]]] = KwargInfo( default=[], ) -_link_with_error = '''can only be self-built targets, external dependencies (including libraries) must go in "dependencies".''' +_LINK_WITH_ERROR = 'Dependency and external_library objects must go in the "dependencies" keyword argument' + +def _link_with_validator(values: T.List[T.Union[BothLibraries, SharedLibrary, StaticLibrary, + CustomTarget, CustomTargetIndex, Jar, Executable, + ]] + ) -> T.Optional[str]: + for value in values: + if not value.is_linkable_target(): + return f'Link target "{value!s}" is not linkable' + return None # Allow Dependency for the better error message? But then in other cases it will list this as one of the allowed types! LINK_WITH_KW: KwargInfo[T.List[T.Union[BothLibraries, SharedLibrary, StaticLibrary, CustomTarget, CustomTargetIndex, Jar, Executable]]] = KwargInfo( 'link_with', - ContainerTypeInfo(list, (BothLibraries, SharedLibrary, StaticLibrary, CustomTarget, CustomTargetIndex, Jar, Executable, Dependency)), + ContainerTypeInfo(list, (BothLibraries, SharedLibrary, StaticLibrary, CustomTarget, CustomTargetIndex, Jar, Executable)), listify=True, default=[], - validator=lambda x: _link_with_error if any(isinstance(i, Dependency) for i in x) else None, + extra_types={Dependency: lambda _: _LINK_WITH_ERROR}, + validator=_link_with_validator, ) -def link_whole_validator(values: T.List[T.Union[StaticLibrary, CustomTarget, CustomTargetIndex, Dependency]]) -> T.Optional[str]: +def link_whole_validator(values: T.List[T.Union[StaticLibrary, CustomTarget, CustomTargetIndex]]) -> T.Optional[str]: for l in values: if isinstance(l, (CustomTarget, CustomTargetIndex)) and l.links_dynamically(): return f'{type(l).__name__} returning a shared library is not allowed' - if isinstance(l, Dependency): - return 
_link_with_error + if not l.is_linkable_target(): + return f'Link target "{l!s}" is not linkable' return None LINK_WHOLE_KW: KwargInfo[T.List[T.Union[BothLibraries, StaticLibrary, CustomTarget, CustomTargetIndex]]] = KwargInfo( 'link_whole', - ContainerTypeInfo(list, (BothLibraries, StaticLibrary, CustomTarget, CustomTargetIndex, Dependency)), + ContainerTypeInfo(list, (BothLibraries, StaticLibrary, CustomTarget, CustomTargetIndex)), listify=True, default=[], validator=link_whole_validator, + extra_types={Dependency: lambda _: _LINK_WITH_ERROR} ) DEPENDENCY_SOURCES_KW: KwargInfo[T.List[T.Union[str, File, GeneratedTypes]]] = KwargInfo( @@ -615,6 +629,12 @@ _ALL_TARGET_KWS: T.List[KwargInfo] = [ ), INSTALL_MODE_KW, INSTALL_TAG_KW, + KwargInfo( + 'install_dir', + ContainerTypeInfo(list, (str, bool)), + default=[], + listify=True, + ), KwargInfo('implicit_include_directories', bool, default=True, since='0.42.0'), NATIVE_KW, KwargInfo('resources', ContainerTypeInfo(list, str), default=[], listify=True), @@ -715,7 +735,11 @@ _BUILD_TARGET_KWS: T.List[KwargInfo] = [ *_ALL_TARGET_KWS, *_LANGUAGE_KWS, BT_SOURCES_KW, + INCLUDE_DIRECTORIES.evolve(since_values={ContainerTypeInfo(list, str): '0.50.0'}), + DEPENDENCIES_KW, INCLUDE_DIRECTORIES.evolve(name='d_import_dirs'), + LINK_WHOLE_KW, + LINK_WITH_KW, _NAME_PREFIX_KW, _NAME_PREFIX_KW.evolve(name='name_suffix', validator=_name_suffix_validator), RUST_CRATE_TYPE_KW, diff --git a/mesonbuild/interpreterbase/decorators.py b/mesonbuild/interpreterbase/decorators.py index 486a58573..591586523 100644 --- a/mesonbuild/interpreterbase/decorators.py +++ b/mesonbuild/interpreterbase/decorators.py @@ -353,6 +353,10 @@ class KwargInfo(T.Generic[_T]): :param not_set_warning: A warning message that is logged if the kwarg is not set by the user. :param feature_validator: A callable returning an iterable of FeatureNew | FeatureDeprecated objects. 
+ :param extra_types: + A mapping of types to a callable that is passed that type and returns an + error message. These types are specifically *not* added to the general + error message """ def __init__(self, name: str, types: T.Union[T.Type[_T], T.Tuple[T.Union[T.Type[_T], ContainerTypeInfo], ...], ContainerTypeInfo], @@ -367,7 +371,8 @@ class KwargInfo(T.Generic[_T]): feature_validator: T.Optional[T.Callable[[_T], T.Iterable[FeatureCheckBase]]] = None, validator: T.Optional[T.Callable[[T.Any], T.Optional[str]]] = None, convertor: T.Optional[T.Callable[[_T], object]] = None, - not_set_warning: T.Optional[str] = None): + not_set_warning: T.Optional[str] = None, + extra_types: T.Optional[T.Mapping[T.Type, T.Callable[[object], str]]] = None): self.name = name self.types = types self.required = required @@ -383,6 +388,7 @@ class KwargInfo(T.Generic[_T]): self.validator = validator self.convertor = convertor self.not_set_warning = not_set_warning + self.extra_types = extra_types if extra_types is not None else {} def evolve(self, *, name: T.Union[str, _NULL_T] = _NULL, @@ -397,7 +403,9 @@ class KwargInfo(T.Generic[_T]): deprecated_values: T.Union[T.Dict[T.Union[_T, ContainerTypeInfo, type], T.Union[str, T.Tuple[str, str]]], None, _NULL_T] = _NULL, feature_validator: T.Union[T.Callable[[_T], T.Iterable[FeatureCheckBase]], None, _NULL_T] = _NULL, validator: T.Union[T.Callable[[_T], T.Optional[str]], None, _NULL_T] = _NULL, - convertor: T.Union[T.Callable[[_T], object], None, _NULL_T] = _NULL) -> 'KwargInfo': + convertor: T.Union[T.Callable[[_T], object], None, _NULL_T] = _NULL, + extra_types: T.Union[T.Mapping[T.Type, T.Callable[[object], str]], None, _NULL_T] = _NULL, + ) -> 'KwargInfo': """Create a shallow copy of this KwargInfo, with modifications. This allows us to create a new copy of a KwargInfo with modifications. 
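The `extra_types` dispatch that this patch adds to `typed_kwargs()` can be illustrated with a small standalone sketch. Everything below is hypothetical scaffolding for illustration only: `Dependency` is a stand-in class, not Meson's real one, and `extra_type_hints` is an invented helper name that mirrors the hunk's logic (scan a list value pairwise against the `extra_types` mapping, or check a scalar value directly, and collect the tailored messages that get appended to the "should have been" error text).

```python
import itertools
import typing as T

class Dependency:
    # Stand-in for Meson's real Dependency class, for illustration only.
    pass

def extra_type_hints(value: object,
                     extra_types: T.Mapping[type, T.Callable[[object], str]]
                     ) -> T.List[str]:
    """Collect tailored hints for values that match an `extra_types` entry.

    Mirrors the dispatch added to typed_kwargs(): types listed in
    `extra_types` are still rejected, but contribute a targeted error
    message instead of being listed among the allowed types.
    """
    extra_desc: T.List[str] = []
    if isinstance(value, list):
        # Pair every (type, callback) entry with every list element.
        for (t, cb), v in itertools.product(extra_types.items(), value):
            if isinstance(v, t):
                extra_desc.append(cb(v))
    else:
        # Scalar kwarg value: check it against each entry directly.
        for t, cb in extra_types.items():
            if isinstance(value, t):
                extra_desc.append(cb(value))
    return extra_desc

# A Dependency passed where link targets are expected yields the hint that
# LINK_WITH_KW installs via extra_types in this patch.
msgs = extra_type_hints(
    [Dependency()],
    {Dependency: lambda _: 'Dependency and external_library objects must go '
                           'in the "dependencies" keyword argument'})
print(msgs[0])
```

The design point, per the docstring above, is that these types are deliberately *not* added to the general "should have been" list: the user who passes a `Dependency` to `link_with` gets a hint about the `dependencies` kwarg rather than a misleading suggestion that `Dependency` is an accepted type.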
@@ -424,6 +432,7 @@ class KwargInfo(T.Generic[_T]): feature_validator=feature_validator if not isinstance(feature_validator, _NULL_T) else self.feature_validator, validator=validator if not isinstance(validator, _NULL_T) else self.validator, convertor=convertor if not isinstance(convertor, _NULL_T) else self.convertor, + extra_types=extra_types if not isinstance(extra_types, _NULL_T) else self.extra_types, ) @@ -529,7 +538,20 @@ def typed_kwargs(name: str, *types: KwargInfo, allow_unknown: bool = False) -> T if info.listify: kwargs[info.name] = value = mesonlib.listify(value) if not check_value_type(types_tuple, value): + extra_desc: T.List[str] = [] + if info.extra_types: + if isinstance(value, list): + for (t, cb), v in itertools.product(info.extra_types.items(), value): + if isinstance(v, t): + extra_desc.append(cb(v)) + else: + for t, cb in info.extra_types.items(): + if isinstance(value, t): + extra_desc.append(cb(value)) + shouldbe = types_description(types_tuple) + if extra_desc: + shouldbe = '{}. {}'.format(shouldbe, '. 
'.join(extra_desc)) raise InvalidArguments(f'{name} keyword argument {info.name!r} was of type {raw_description(value)} but should have been {shouldbe}') if info.validator is not None: diff --git a/mesonbuild/mdevenv.py b/mesonbuild/mdevenv.py index e6c4fad4d..46ec14824 100644 --- a/mesonbuild/mdevenv.py +++ b/mesonbuild/mdevenv.py @@ -84,7 +84,7 @@ def bash_completion_files(b: build.Build, install_data: 'InstallData') -> T.List from .dependencies.pkgconfig import PkgConfigDependency result = [] dep = PkgConfigDependency('bash-completion', b.environment, - {'required': False, 'silent': True, 'version': ['>=2.10']}) + {'required': False, 'silent': True, 'version': ['>=2.10'], 'native': MachineChoice.HOST}) if dep.found(): prefix = b.environment.coredata.optstore.get_value_for(OptionKey('prefix')) assert isinstance(prefix, str), 'for mypy' diff --git a/mesonbuild/mintro.py b/mesonbuild/mintro.py index b1e0941c8..28900492f 100644 --- a/mesonbuild/mintro.py +++ b/mesonbuild/mintro.py @@ -228,12 +228,12 @@ def list_targets(builddata: build.Build, installdata: backends.InstallData, back if not isinstance(target, build.Target): raise RuntimeError('The target object in `builddata.get_targets()` is not of type `build.Target`. 
Please file a bug with this error message.') - outdir = get_target_dir(builddata.environment.coredata, target.subdir) + outdir = get_target_dir(builddata.environment.coredata, target.get_builddir()) t = { 'name': target.get_basename(), 'id': idname, 'type': target.get_typename(), - 'defined_in': os.path.normpath(os.path.join(src_dir, target.subdir, environment.build_filename)), + 'defined_in': os.path.normpath(os.path.join(src_dir, target.get_subdir(), environment.build_filename)), 'filename': [os.path.join(build_dir, outdir, x) for x in target.get_outputs()], 'build_by_default': target.build_by_default, 'target_sources': backend.get_introspection_data(idname, target), diff --git a/mesonbuild/modules/__init__.py b/mesonbuild/modules/__init__.py index 3ff9368d9..0df85faf1 100644 --- a/mesonbuild/modules/__init__.py +++ b/mesonbuild/modules/__init__.py @@ -7,7 +7,7 @@ from __future__ import annotations import dataclasses import typing as T -from .. import build, mesonlib +from .. import build, dependencies, mesonlib, mlog from ..options import OptionKey from ..build import IncludeDirs from ..interpreterbase.decorators import noKwargs, noPosargs @@ -15,6 +15,7 @@ from ..mesonlib import relpath, HoldableObject, MachineChoice from ..programs import ExternalProgram if T.TYPE_CHECKING: + from ..dependencies.base import DependencyObjectKWs from ..interpreter import Interpreter from ..interpreter.interpreter import ProgramVersionFunc from ..interpreterbase import TYPE_var, TYPE_kwargs @@ -36,6 +37,7 @@ class ModuleState: self.source_root = interpreter.environment.get_source_dir() self.build_to_src = relpath(interpreter.environment.get_source_dir(), interpreter.environment.get_build_dir()) + self.subproject_dir = interpreter.subproject_dir self.subproject = interpreter.subproject self.subdir = interpreter.subdir self.root_subdir = interpreter.root_subdir @@ -46,6 +48,7 @@ class ModuleState: # The backend object is under-used right now, but we will need it: # 
https://github.com/mesonbuild/meson/issues/1419 self.backend = interpreter.backend + self.dependency_overrides = interpreter.build.dependency_overrides self.targets = interpreter.build.targets self.data = interpreter.build.data self.headers = interpreter.build.get_headers() @@ -108,8 +111,29 @@ class ModuleState: # Normal program lookup return self.find_program(name, required=required, wanted=wanted) + def override_dependency(self, depname: str, dep: Dependency, static: T.Optional[bool] = None, + for_machine: MachineChoice = MachineChoice.HOST) -> None: + kwargs: DependencyObjectKWs = {} + if static is not None: + kwargs['static'] = static + identifier = dependencies.get_dep_identifier(depname, kwargs) + override = self.dependency_overrides[for_machine].get(identifier) + if override: + m = 'Tried to override dependency {!r} which has already been resolved or overridden at {}' + location = mlog.get_error_location_string(override.node.filename, override.node.lineno) + raise mesonlib.MesonException(m.format(depname, location)) + self.dependency_overrides[for_machine][identifier] = \ + build.DependencyOverride(dep, self._interpreter.current_node) + + def overridden_dependency(self, depname: str, for_machine: MachineChoice = MachineChoice.HOST) -> Dependency: + identifier = dependencies.get_dep_identifier(depname, {}) + try: + return self.dependency_overrides[for_machine][identifier].dep + except KeyError: + raise mesonlib.MesonException(f'dependency "{depname}" was not overridden for the {for_machine}') + def dependency(self, depname: str, native: bool = False, required: bool = True, - wanted: T.Optional[str] = None) -> 'Dependency': + wanted: T.Optional[T.Union[str, T.List[str]]] = None) -> 'Dependency': kwargs: T.Dict[str, object] = {'native': native, 'required': required} if wanted: kwargs['version'] = wanted diff --git a/mesonbuild/modules/_qt.py b/mesonbuild/modules/_qt.py index ae6e2d477..832e808bf 100644 --- a/mesonbuild/modules/_qt.py +++ 
b/mesonbuild/modules/_qt.py @@ -15,7 +15,7 @@ from .. import build from .. import options from .. import mlog from ..dependencies import DependencyMethods, find_external_dependency, Dependency, ExternalLibrary, InternalDependency -from ..mesonlib import MesonException, File, FileMode, version_compare, Popen_safe +from ..mesonlib import MachineChoice, MesonException, File, FileMode, version_compare, Popen_safe from ..interpreter import extract_required_kwarg from ..interpreter.type_checking import DEPENDENCY_METHOD_KW, INSTALL_DIR_KW, INSTALL_KW, NoneType from ..interpreterbase import ContainerTypeInfo, FeatureDeprecated, KwargInfo, noPosargs, FeatureNew, typed_kwargs, typed_pos_args @@ -270,7 +270,7 @@ class QtBaseModule(ExtensionModule): return self._tools_detected = True mlog.log(f'Detecting Qt{self.qt_version} tools') - kwargs: DependencyObjectKWs = {'required': required, 'modules': ['Core'], 'method': method} + kwargs: DependencyObjectKWs = {'required': required, 'modules': ['Core'], 'method': method, 'native': MachineChoice.HOST} # Just pick one to make mypy happy qt = T.cast('QtPkgConfigDependency', find_external_dependency(f'qt{self.qt_version}', state.environment, kwargs)) if qt.found(): diff --git a/mesonbuild/modules/gnome.py b/mesonbuild/modules/gnome.py index 0f522c106..758a213eb 100644 --- a/mesonbuild/modules/gnome.py +++ b/mesonbuild/modules/gnome.py @@ -829,8 +829,8 @@ class GnomeModule(ExtensionModule): if isinstance(inc, str): ret += [f'--include={inc}'] elif isinstance(inc, GirTarget): - gir_inc_dirs .append(os.path.join(state.environment.get_build_dir(), inc.get_subdir())) - ret.append(f"--include-uninstalled={os.path.join(inc.get_subdir(), inc.get_basename())}") + gir_inc_dirs .append(os.path.join(state.environment.get_build_dir(), inc.get_builddir())) + ret.append(f"--include-uninstalled={os.path.join(inc.get_builddir(), inc.get_basename())}") depends.append(inc) return ret, gir_inc_dirs, depends diff --git a/mesonbuild/modules/i18n.py 
b/mesonbuild/modules/i18n.py index 2d8d04d3e..3f021dd0d 100644 --- a/mesonbuild/modules/i18n.py +++ b/mesonbuild/modules/i18n.py @@ -485,7 +485,7 @@ class I18nModule(ExtensionModule): mo_fnames = [] for target in mo_targets: - mo_fnames.append(path.join(target.get_subdir(), target.get_outputs()[0])) + mo_fnames.append(path.join(target.get_builddir(), target.get_outputs()[0])) command: T.List[T.Union[str, build.BuildTargetTypes, ExternalProgram, mesonlib.File]] = [] command.extend(state.environment.get_build_command()) diff --git a/mesonbuild/modules/python.py b/mesonbuild/modules/python.py index 99602c05a..23b2aa788 100644 --- a/mesonbuild/modules/python.py +++ b/mesonbuild/modules/python.py @@ -51,6 +51,8 @@ if T.TYPE_CHECKING: class ExtensionModuleKw(SharedModuleKw): + # Yes, these are different between SharedModule and ExtensionModule + install_dir: T.Union[str, bool, None] # type: ignore[misc] subdir: NotRequired[T.Optional[str]] MaybePythonProg = T.Union[NonExistingExternalProgram, 'PythonExternalProgram'] @@ -60,7 +62,8 @@ mod_kwargs = {'subdir', 'limited_api'} mod_kwargs.update(known_shmod_kwargs) mod_kwargs -= {'name_prefix', 'name_suffix'} -_MOD_KWARGS = [k for k in SHARED_MOD_KWS if k.name not in {'name_prefix', 'name_suffix'}] +_MOD_KWARGS = [k for k in SHARED_MOD_KWS if + k.name not in {'name_prefix', 'name_suffix', 'install_dir'}] class PythonExternalProgram(BasicPythonExternalProgram): @@ -139,26 +142,35 @@ class PythonInstallation(_ExternalProgramHolder['PythonExternalProgram']): @permittedKwargs(mod_kwargs) @typed_pos_args('python.extension_module', str, varargs=(str, mesonlib.File, CustomTarget, CustomTargetIndex, GeneratedList, StructuredSources, ExtractedObjects, BuildTarget)) - @typed_kwargs('python.extension_module', *_MOD_KWARGS, _DEFAULTABLE_SUBDIR_KW, _LIMITED_API_KW, allow_unknown=True) + @typed_kwargs( + 'python.extension_module', + *_MOD_KWARGS, + _DEFAULTABLE_SUBDIR_KW, + _LIMITED_API_KW, + KwargInfo('install_dir', (str, bool, 
NoneType)), + allow_unknown=True + ) @InterpreterObject.method('extension_module') def extension_module_method(self, args: T.Tuple[str, T.List[BuildTargetSource]], kwargs: ExtensionModuleKw) -> 'SharedModule': - if 'install_dir' in kwargs: + if kwargs['install_dir'] is not None: if kwargs['subdir'] is not None: raise InvalidArguments('"subdir" and "install_dir" are mutually exclusive') + # the build_target() method now expects this to be correct. + kwargs['install_dir'] = [kwargs['install_dir']] else: # We want to remove 'subdir', but it may be None and we want to replace it with '' # It must be done this way since we don't allow both `install_dir` # and `subdir` to be set at the same time subdir = kwargs.pop('subdir') or '' - kwargs['install_dir'] = self._get_install_dir_impl(False, subdir) + kwargs['install_dir'] = [self._get_install_dir_impl(False, subdir)] target_suffix = self.suffix new_deps = mesonlib.extract_as_list(kwargs, 'dependencies') pydep = next((dep for dep in new_deps if isinstance(dep, _PythonDependencyBase)), None) if pydep is None: - pydep = self._dependency_method_impl({}) + pydep = self._dependency_method_impl({'native': kwargs['native']}) if not pydep.found(): raise mesonlib.MesonException('Python dependency not found') new_deps.append(pydep) @@ -247,7 +259,7 @@ class PythonInstallation(_ExternalProgramHolder['PythonExternalProgram']): return '0x{:02x}{:02x}0000'.format(major, minor) def _dependency_method_impl(self, kwargs: DependencyObjectKWs) -> Dependency: - for_machine = kwargs.get('native', MachineChoice.HOST) + for_machine = kwargs['native'] identifier = get_dep_identifier(self._full_path(), kwargs) dep = self.interpreter.coredata.deps[for_machine].get(identifier) @@ -260,7 +272,7 @@ class PythonInstallation(_ExternalProgramHolder['PythonExternalProgram']): new_kwargs['required'] = False if build_config: new_kwargs['build_config'] = build_config - candidates = python_factory(self.interpreter.environment, for_machine, new_kwargs, 
self.held_object) + candidates = python_factory(self.interpreter.environment, new_kwargs, self.held_object) dep = find_external_dependency('python', self.interpreter.environment, new_kwargs, candidates) self.interpreter.coredata.deps[for_machine].put(identifier, dep) diff --git a/mesonbuild/modules/rust.py b/mesonbuild/modules/rust.py index a8fcc86a0..783df1865 100644 --- a/mesonbuild/modules/rust.py +++ b/mesonbuild/modules/rust.py @@ -10,29 +10,36 @@ import typing as T from mesonbuild.interpreterbase.decorators import FeatureNew -from . import ExtensionModule, ModuleReturnValue, ModuleInfo +from . import ExtensionModule, ModuleReturnValue, ModuleInfo, ModuleObject from .. import mesonlib, mlog from ..build import (BothLibraries, BuildTarget, CustomTargetIndex, Executable, ExtractedObjects, GeneratedList, - CustomTarget, InvalidArguments, Jar, StructuredSources, SharedLibrary, StaticLibrary) + CustomTarget, InvalidArguments, Jar, StructuredSources, SharedLibrary, StaticLibrary, + SharedModule) from ..compilers.compilers import are_asserts_disabled_for_subproject, lang_suffixes +from ..compilers.rust import RustSystemDependency +from ..dependencies import Dependency from ..interpreter.type_checking import ( DEPENDENCIES_KW, LINK_WITH_KW, LINK_WHOLE_KW, SHARED_LIB_KWS, TEST_KWS, TEST_KWS_NO_ARGS, - OUTPUT_KW, INCLUDE_DIRECTORIES, SOURCES_VARARGS, NoneType, in_set_validator + OUTPUT_KW, INCLUDE_DIRECTORIES, SOURCES_VARARGS, NoneType, in_set_validator, + EXECUTABLE_KWS, LIBRARY_KWS, SHARED_MOD_KWS, _BASE_LANG_KW ) -from ..interpreterbase import ContainerTypeInfo, InterpreterException, KwargInfo, typed_kwargs, typed_pos_args, noPosargs, permittedKwargs +from ..interpreterbase import ContainerTypeInfo, InterpreterException, KwargInfo, typed_kwargs, typed_pos_args, noKwargs, noPosargs, permittedKwargs from ..interpreter.interpreterobjects import Doctest -from ..mesonlib import File, MachineChoice, MesonException, PerMachine +from ..mesonlib import (is_parent_path, File, 
MachineChoice, MesonException, PerMachine) from ..programs import ExternalProgram, NonExistingExternalProgram if T.TYPE_CHECKING: from . import ModuleState from ..build import BuildTargetTypes, ExecutableKeywordArguments, IncludeDirs, LibTypes + from .. import cargo + from ..cargo.interpreter import RUST_ABI from ..compilers.rust import RustCompiler - from ..dependencies import Dependency, ExternalLibrary + from ..dependencies import ExternalLibrary from ..interpreter import Interpreter from ..interpreter import kwargs as _kwargs from ..interpreter.interpreter import SourceInputs, SourceOutputs from ..interpreter.interpreterobjects import Test + from ..interpreterbase import TYPE_kwargs from ..programs import OverrideProgram from ..interpreter.type_checking import SourcesVarargsType @@ -63,6 +70,22 @@ if T.TYPE_CHECKING: language: T.Optional[Literal['c', 'cpp']] bindgen_version: T.List[str] + class FuncWorkspace(TypedDict): + default_features: T.Optional[bool] + features: T.List[str] + + class FuncDependency(TypedDict): + rust_abi: T.Optional[RUST_ABI] + + class RustPackageExecutable(_kwargs.Executable): + dependencies: T.List[T.Union[Dependency, ExternalLibrary]] + link_with: T.List[LibTypes] + link_whole: T.List[LibTypes] + + class RustPackageLibrary(_kwargs.Library): + dependencies: T.List[T.Union[Dependency, ExternalLibrary]] + link_with: T.List[LibTypes] + link_whole: T.List[LibTypes] RUST_TEST_KWS: T.List[KwargInfo] = [ KwargInfo( @@ -80,6 +103,402 @@ def no_spaces_validator(arg: T.Optional[T.Union[str, T.List]]) -> T.Optional[str return 'must not contain spaces due to limitations of rustdoc' return None +def dep_to_system_dependency(dep: Dependency, depname: str) -> Dependency: + if not dep.found(): + return dep + if not depname: + if not dep.name: + raise MesonException("rust.to_system_dependency() called with an unnamed dependency and no explicit name") + depname = dep.name + depname = re.sub(r'[^a-zA-Z0-9]', '_', depname) + rust_args = ['--cfg', 
f'system_deps_have_{depname}'] + return RustSystemDependency(dep.version, compile_args=rust_args, ext_deps=[dep], name=dep.name) + +class RustWorkspace(ModuleObject): + """Represents a Rust workspace, controlling the build of packages + recorded in a Cargo.lock file.""" + + def __init__(self, interpreter: Interpreter, ws: cargo.WorkspaceState) -> None: + super().__init__() + self.interpreter = interpreter + self.ws = ws + self.methods.update({ + 'packages': self.packages_method, + 'package': self.package_method, + 'subproject': self.subproject_method, + }) + + @property + def subdir(self) -> str: + return self.ws.subdir + + @noPosargs + @noKwargs + def packages_method(self, state: ModuleState, args: T.List, kwargs: TYPE_kwargs) -> T.List[str]: + """Returns list of package names in workspace.""" + package_names = [pkg.manifest.package.name for pkg in self.ws.packages.values()] + return sorted(package_names) + + @typed_pos_args('workspace.package', optargs=[str]) + def package_method(self, state: 'ModuleState', args: T.List, kwargs: TYPE_kwargs) -> RustPackage: + """Returns a package object.""" + package_name = args[0] if args else None + return RustPackage(self, self.interpreter.cargo.load_package(self.ws, package_name)) + + def _do_subproject(self, pkg: cargo.PackageState) -> None: + kw: _kwargs.DoSubproject = { + 'required': True, + 'version': None, + 'options': None, + 'cmake_options': [], + 'default_options': {}, + } + subp_name = pkg.get_subproject_name() + self.interpreter.do_subproject(subp_name, kw, force_method='cargo') + + @typed_pos_args('workspace.subproject', str, optargs=[str]) + @noKwargs + def subproject_method(self, state: ModuleState, args: T.Tuple[str, T.Optional[str]], kwargs: TYPE_kwargs) -> RustSubproject: + """Returns a package object for a subproject package.""" + package_name = args[0] + pkg = self.interpreter.cargo.resolve_package(package_name, args[1] or '') + if pkg is None: + if args[1]: + raise MesonException(f'No version of cargo 
package "{package_name}" provides API {args[1]}') + else: + raise MesonException(f'Cargo package "{package_name}" not available') + + self._do_subproject(pkg) + return RustSubproject(self, pkg) + + +class RustCrate(ModuleObject): + """Abstract base class for Rust crate representations.""" + + def __init__(self, rust_ws: RustWorkspace, package: cargo.PackageState) -> None: + super().__init__() + self.rust_ws = rust_ws + self.package = package + self.methods.update({ + 'all_features': self.all_features_method, + 'api': self.api_method, + 'features': self.features_method, + 'name': self.name_method, + 'version': self.version_method, + 'rust_args': self.rust_args_method, + 'env': self.env_method, # type: ignore[dict-item] + 'rust_dependency_map': self.rust_dependency_map_method, # type: ignore[dict-item] + }) + + @noPosargs + @noKwargs + def name_method(self, state: ModuleState, args: T.List, kwargs: TYPE_kwargs) -> str: + """Returns the name of the package.""" + return self.package.manifest.package.name + + @noPosargs + @noKwargs + def api_method(self, state: ModuleState, args: T.List, kwargs: TYPE_kwargs) -> str: + """Returns the API version of the package.""" + return self.package.manifest.package.api + + @noPosargs + @noKwargs + def version_method(self, state: ModuleState, args: T.List, kwargs: TYPE_kwargs) -> str: + """Returns the version of the package.""" + return self.package.manifest.package.version + + @noPosargs + @noKwargs + def all_features_method(self, state: ModuleState, args: T.List, kwargs: TYPE_kwargs) -> T.List[str]: + """Returns all features for specific package.""" + return sorted(list(self.package.manifest.features.keys())) + + @noPosargs + @noKwargs + def features_method(self, state: ModuleState, args: T.List, kwargs: TYPE_kwargs) -> T.List[str]: + """Returns chosen features for specific package.""" + return sorted(list(self.package.cfg.features)) + + @noPosargs + @noKwargs + def rust_args_method(self, state: ModuleState, args: T.List, kwargs: 
TYPE_kwargs) -> T.List[str]: + """Returns rustc arguments for this package.""" + return self.package.get_rustc_args(state.environment, state.subdir, mesonlib.MachineChoice.HOST) + + @noPosargs + @noKwargs + def env_method(self, state: ModuleState, args: T.List, kwargs: TYPE_kwargs) -> T.Dict[str, str]: + """Returns environment variables for this package.""" + return self.package.get_env_dict(state.environment, state.subdir) + + @noPosargs + @noKwargs + def rust_dependency_map_method(self, state: ModuleState, args: T.List, kwargs: TYPE_kwargs) -> T.Dict[str, str]: + """Returns rust dependency mapping for this package.""" + return self.package.cfg.get_dependency_map(self.package.manifest) + + +class RustPackage(RustCrate): + """Represents a Rust package within a workspace.""" + + def __init__(self, rust_ws: RustWorkspace, package: cargo.PackageState) -> None: + super().__init__(rust_ws, package) + self.methods.update({ + 'dependencies': self.dependencies_method, + 'library': self.library_method, + 'proc_macro': self.proc_macro_method, + 'shared_module': self.shared_module_method, + 'executable': self.executable_method, + 'override_dependency': self.override_dependency_method, + }) + + @noPosargs + @typed_kwargs('package.dependencies', + KwargInfo('dependencies', bool, default=True), + KwargInfo('dev_dependencies', bool, default=False), + KwargInfo('system_dependencies', bool, default=True)) + def dependencies_method(self, state: ModuleState, args: T.List, kwargs: T.Dict[str, T.Any]) -> T.List[Dependency]: + """Returns the dependencies for this package.""" + dependencies: T.List[Dependency] = [] + cfg = self.package.cfg + + if kwargs['dependencies']: + for dep_key, dep_pkg in cfg.dep_packages.items(): + if dep_pkg.manifest.lib: + if dep_pkg.ws_subdir != self.rust_ws.subdir or \ + is_parent_path(os.path.join(self.rust_ws.subdir, state.subproject_dir), + dep_pkg.path): + self.rust_ws._do_subproject(dep_pkg) + # Get the dependency name for this package + depname = 
dep_pkg.get_dependency_name(None) + dependency = state.overridden_dependency(depname) + dependencies.append(dependency) + + if kwargs['dev_dependencies']: + raise MesonException('dev_dependencies is not implemented yet') + + if kwargs['system_dependencies']: + for name, sys_dep in self.package.manifest.system_dependencies.items(): + if sys_dep.enabled(cfg.features): + # System dependencies use the original dependency name from Cargo.toml + dependency = state.dependency(sys_dep.name, required=not sys_dep.optional, + wanted=sys_dep.meson_version) + dependencies.append(dep_to_system_dependency(dependency, name)) + + return dependencies + + @staticmethod + def validate_pos_args(name: str, args: T.Tuple[ + T.Optional[T.Union[str, StructuredSources]], + T.Optional[StructuredSources]]) -> T.Tuple[T.Optional[str], T.Optional[StructuredSources]]: + if isinstance(args[0], str): + return args[0], args[1] + if args[1] is not None: + raise MesonException(f"{name} only accepts one StructuredSources parameter") + return None, args[0] + + def merge_kw_args(self, state: ModuleState, kwargs: T.Union[RustPackageExecutable, RustPackageLibrary]) -> None: + deps = kwargs['dependencies'] + kwargs['dependencies'] = self.dependencies_method(state, [], {}) + kwargs['dependencies'].extend(deps) + + depmap = kwargs['rust_dependency_map'] + kwargs['rust_dependency_map'] = self.rust_dependency_map_method(state, [], {}) + kwargs['rust_dependency_map'].update(depmap) + + rust_args = kwargs['rust_args'] + kwargs['rust_args'] = self.rust_args_method(state, [], {}) + kwargs['rust_args'].extend(rust_args) + + kwargs['override_options'].setdefault('rust_std', self.package.manifest.package.edition) + + def _library_method(self, state: ModuleState, args: T.Tuple[ + T.Optional[T.Union[str, StructuredSources]], + T.Optional[StructuredSources]], kwargs: RustPackageLibrary, + static: bool, shared: bool, + shared_mod: bool = False) -> T.Union[BothLibraries, SharedLibrary, StaticLibrary]: + tgt_args = 
self.validate_pos_args('package.library', args) + if not self.package.manifest.lib: + raise MesonException("no [lib] section in Cargo package") + + sources: T.Union[StructuredSources, str] + tgt_name, sources = tgt_args + if not tgt_name: + rust_abi: RUST_ABI + if kwargs['rust_crate_type'] is not None: + rust_abi = 'rust' if kwargs['rust_crate_type'] in {'lib', 'rlib', 'dylib', 'proc-macro'} else 'c' + else: + rust_abi = kwargs['rust_abi'] + tgt_name = self.package.library_name(rust_abi) + if not sources: + sources = self.package.manifest.lib.path + + lib_args: T.Tuple[str, SourcesVarargsType] = (tgt_name, [sources]) + self.merge_kw_args(state, kwargs) + + if shared_mod: + return state._interpreter.build_target(state.current_node, lib_args, + T.cast('_kwargs.SharedModule', kwargs), + SharedModule) + + if static and shared: + return state._interpreter.build_both_libraries(state.current_node, lib_args, kwargs) + elif shared: + return state._interpreter.build_target(state.current_node, lib_args, + T.cast('_kwargs.SharedLibrary', kwargs), + SharedLibrary) + else: + return state._interpreter.build_target(state.current_node, lib_args, + T.cast('_kwargs.StaticLibrary', kwargs), + StaticLibrary) + + def _proc_macro_method(self, state: 'ModuleState', args: T.Tuple[ + T.Optional[T.Union[str, StructuredSources]], + T.Optional[StructuredSources]], kwargs: RustPackageLibrary) -> SharedLibrary: + kwargs['native'] = MachineChoice.BUILD + kwargs['rust_abi'] = None + kwargs['rust_crate_type'] = 'proc-macro' + kwargs['rust_args'] = kwargs['rust_args'] + ['--extern', 'proc_macro'] + result = self._library_method(state, args, kwargs, shared=True, static=False) + return T.cast('SharedLibrary', result) + + @typed_pos_args('package.override_dependency', Dependency) + @typed_kwargs('package.override_dependency', + KwargInfo('rust_abi', (str, NoneType), default=None, validator=in_set_validator({'rust', 'c', 'proc-macro'}))) + def override_dependency_method(self, state: ModuleState, args: 
T.Tuple[Dependency], kwargs: FuncDependency) -> None: + dep = args[0] + rust_abi = self.package.abi_resolve_default(kwargs['rust_abi']) + depname = self.package.get_dependency_name(rust_abi) + state.override_dependency(depname, dep) + if self.package.abi_has_static(rust_abi): + state.override_dependency(depname, dep, static=True) + if self.package.abi_has_shared(rust_abi): + state.override_dependency(depname, dep, static=False) + + @typed_pos_args('package.library', optargs=[(str, StructuredSources), StructuredSources]) + @typed_kwargs( + 'package.library', + *LIBRARY_KWS, + DEPENDENCIES_KW, + LINK_WITH_KW, + LINK_WHOLE_KW, + _BASE_LANG_KW.evolve(name='rust_args'), + ) + def library_method(self, state: ModuleState, args: T.Tuple[ + T.Optional[T.Union[str, StructuredSources]], + T.Optional[StructuredSources]], kwargs: RustPackageLibrary) -> T.Union[BothLibraries, SharedLibrary, StaticLibrary]: + if not self.package.manifest.lib: + raise MesonException("no [lib] section in Cargo package") + if kwargs['rust_crate_type'] is not None: + static = kwargs['rust_crate_type'] in {'lib', 'rlib', 'staticlib'} + shared = kwargs['rust_crate_type'] in {'dylib', 'cdylib', 'proc-macro'} + else: + rust_abi = self.package.abi_resolve_default(kwargs['rust_abi']) + static = self.package.abi_has_static(rust_abi) + shared = self.package.abi_has_shared(rust_abi) + if rust_abi == 'proc-macro': + kwargs['rust_crate_type'] = 'proc-macro' + kwargs['rust_abi'] = None + else: + kwargs['rust_abi'] = rust_abi + return self._library_method(state, args, kwargs, static=static, shared=shared) + + @typed_pos_args('package.proc_macro', optargs=[(str, StructuredSources), StructuredSources]) + @typed_kwargs( + 'package.proc_macro', + *SHARED_LIB_KWS, + DEPENDENCIES_KW, + LINK_WITH_KW, + LINK_WHOLE_KW, + _BASE_LANG_KW.evolve(name='rust_args'), + ) + def proc_macro_method(self, state: 'ModuleState', args: T.Tuple[ + T.Optional[T.Union[str, StructuredSources]], + T.Optional[StructuredSources]], kwargs: 
RustPackageLibrary) -> SharedLibrary: + if not self.package.manifest.lib: + raise MesonException("no [lib] section in Cargo package") + if 'proc-macro' not in self.package.manifest.lib.crate_type: + raise MesonException("not a procedural macro crate") + return self._proc_macro_method(state, args, kwargs) + + @typed_pos_args('package.shared_module', optargs=[(str, StructuredSources), StructuredSources]) + @typed_kwargs( + 'package.shared_module', + *SHARED_MOD_KWS, + DEPENDENCIES_KW, + LINK_WITH_KW, + LINK_WHOLE_KW, + _BASE_LANG_KW.evolve(name='rust_args'), + ) + def shared_module_method(self, state: 'ModuleState', args: T.Tuple[ + T.Optional[T.Union[str, StructuredSources]], + T.Optional[StructuredSources]], kwargs: RustPackageLibrary) -> SharedModule: + if not self.package.manifest.lib: + raise MesonException("no [lib] section in Cargo package") + if 'cdylib' not in self.package.manifest.lib.crate_type: + raise MesonException("not a cdylib crate") + + kwargs['rust_abi'] = None + kwargs['rust_crate_type'] = 'cdylib' + result = self._library_method(state, args, kwargs, shared=True, static=False, shared_mod=True) + return T.cast('SharedModule', result) + + @typed_pos_args('package.executable', optargs=[(str, StructuredSources), StructuredSources]) + @typed_kwargs( + 'package.executable', + *EXECUTABLE_KWS, + DEPENDENCIES_KW, + LINK_WITH_KW, + LINK_WHOLE_KW, + _BASE_LANG_KW.evolve(name='rust_args'), + ) + def executable_method(self, state: 'ModuleState', args: T.Tuple[ + T.Optional[T.Union[str, StructuredSources]], + T.Optional[StructuredSources]], kwargs: RustPackageExecutable) -> Executable: + """Builds executable targets from workspace bins.""" + tgt_args = self.validate_pos_args('package.executable', args) + if not self.package.manifest.bin: + raise MesonException("no [[bin]] section in Cargo package") + + sources: T.Union[StructuredSources, str] + tgt_name, sources = tgt_args + # If there's more than one binary, the first argument must be specified + # and must 
be one of the keys in pkg.bin + if not tgt_name: + if len(self.package.manifest.bin) > 1: + raise MesonException("Package has multiple binaries, you must specify which one to build as the first argument") + # Single binary, use it + tgt_name = next(iter(self.package.manifest.bin.keys())) + else: + if tgt_name not in self.package.manifest.bin: + raise MesonException(f"Binary '{tgt_name}' not found.") + + if not sources: + sources = self.package.manifest.bin[tgt_name].path + + exe_args: T.Tuple[str, SourcesVarargsType] = (tgt_name, [sources]) + self.merge_kw_args(state, kwargs) + return state._interpreter.build_target(state.current_node, exe_args, kwargs, Executable) + + +class RustSubproject(RustCrate): + """Represents a Cargo subproject.""" + + def __init__(self, rust_ws: RustWorkspace, package: cargo.PackageState) -> None: + super().__init__(rust_ws, package) + self.methods.update({ + 'dependency': self.dependency_method, + }) + + @noPosargs + @typed_kwargs('package.dependency', + KwargInfo('rust_abi', (str, NoneType), default=None, validator=in_set_validator({'rust', 'c', 'proc-macro'}))) + def dependency_method(self, state: ModuleState, args: T.List, kwargs: FuncDependency) -> Dependency: + """Returns dependency for the package with the given ABI.""" + depname = self.package.get_dependency_name(kwargs['rust_abi']) + return state.overridden_dependency(depname) + class RustModule(ExtensionModule): @@ -103,6 +522,8 @@ class RustModule(ExtensionModule): 'doctest': self.doctest, 'bindgen': self.bindgen, 'proc_macro': self.proc_macro, + 'to_system_dependency': self.to_system_dependency, + 'workspace': self.workspace, }) def test_common(self, funcname: str, state: ModuleState, args: T.Tuple[str, BuildTarget], kwargs: FuncRustTest) -> T.Tuple[Executable, _kwargs.FuncTest]: @@ -500,6 +921,49 @@ class RustModule(ExtensionModule): target = state._interpreter.build_target(state.current_node, args, kwargs, SharedLibrary) return target + 
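Reviewer note: the `dep_to_system_dependency()` helper added earlier in this diff maps an arbitrary dependency name to a `--cfg system_deps_have_NAME` rustc flag by replacing every non-alphanumeric character with an underscore, matching what the `system-deps` crate emits. A standalone sketch of that transformation (the helper name `cfg_for_system_dep` is hypothetical, for illustration only):

```python
import re

def cfg_for_system_dep(depname: str) -> list:
    # Mirror the sanitization in dep_to_system_dependency(): any character
    # outside [a-zA-Z0-9] becomes '_', so e.g. 'glib-2.0' -> 'glib_2_0'.
    sanitized = re.sub(r'[^a-zA-Z0-9]', '_', depname)
    return ['--cfg', f'system_deps_have_{sanitized}']
```

For example, `cfg_for_system_dep('glib-2.0')` yields `['--cfg', 'system_deps_have_glib_2_0']`, which is the cfg name Rust code checks via `#[cfg(system_deps_have_glib_2_0)]`.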
@FeatureNew('rust.to_system_dependency', '1.11.0') + @typed_pos_args('rust.to_system_dependency', Dependency, optargs=[str]) + @noKwargs + def to_system_dependency(self, state: ModuleState, args: T.Tuple[Dependency, T.Optional[str]], kwargs: TYPE_kwargs) -> Dependency: + dep, depname = args + return dep_to_system_dependency(dep, depname) + + @FeatureNew('rust.workspace', '1.11.0') + @noPosargs + @typed_kwargs( + 'rust.workspace', + KwargInfo('default_features', (bool, NoneType), default=None), + KwargInfo( + 'features', + (ContainerTypeInfo(list, str), NoneType), + default=None, + listify=True, + ), + ) + def workspace(self, state: ModuleState, args: T.List, kwargs: FuncWorkspace) -> RustWorkspace: + """Creates a Rust workspace object, controlling the build of + all the packages in a Cargo.lock file.""" + if self.interpreter.cargo is None: + raise MesonException("rust.workspace() requires a Cargo project (Cargo.toml and Cargo.lock)") + + self.interpreter.add_languages(['rust'], True, MachineChoice.HOST) + self.interpreter.add_languages(['rust'], True, MachineChoice.BUILD) + + default_features = kwargs['default_features'] + features = kwargs['features'] + if default_features is not None or features is not None: + # If custom features are provided, default_features = None should be treated as True + if default_features is None: + default_features = True + + cargo_features = ['default'] if default_features else [] + if features is not None: + cargo_features.extend(features) + self.interpreter.cargo.features = cargo_features + + ws = self.interpreter.cargo.load_workspace(state.root_subdir) + return RustWorkspace(self.interpreter, ws) + def initialize(interp: Interpreter) -> RustModule: return RustModule(interp) diff --git a/mesonbuild/options.py b/mesonbuild/options.py index 8e29f29f3..5832461f0 100644 --- a/mesonbuild/options.py +++ b/mesonbuild/options.py @@ -289,15 +289,21 @@ class OptionKey: def as_root(self) -> OptionKey: """Convenience method for 
key.evolve(subproject='').""" - return self.evolve(subproject='') + if self.subproject != '': + return self.evolve(subproject='') + return self def as_build(self) -> OptionKey: """Convenience method for key.evolve(machine=MachineChoice.BUILD).""" - return self.evolve(machine=MachineChoice.BUILD) + if self.machine != MachineChoice.BUILD: + return self.evolve(machine=MachineChoice.BUILD) + return self def as_host(self) -> OptionKey: """Convenience method for key.evolve(machine=MachineChoice.HOST).""" - return self.evolve(machine=MachineChoice.HOST) + if self.machine != MachineChoice.HOST: + return self.evolve(machine=MachineChoice.HOST) + return self def has_module_prefix(self) -> bool: return '.' in self.name @@ -835,7 +841,7 @@ class OptionStore: # # I did not do this yet, because it would make this MR even # more massive than it already is. Later then. - if not self.is_cross and key.machine == MachineChoice.BUILD: + if not (self.is_cross and self.is_per_machine_option(key)): key = key.as_host() return key diff --git a/test cases/common/287 invalid dependency arguments/lib.c b/test cases/common/287 invalid dependency arguments/lib.c new file mode 100644 index 000000000..a324dca21 --- /dev/null +++ b/test cases/common/287 invalid dependency arguments/lib.c @@ -0,0 +1 @@ +int func(void) { return 0; } diff --git a/test cases/failing/124 subproject object as a dependency/main.c b/test cases/common/287 invalid dependency arguments/main.c index 78f2de106..78f2de106 100644 --- a/test cases/failing/124 subproject object as a dependency/main.c +++ b/test cases/common/287 invalid dependency arguments/main.c diff --git a/test cases/common/287 invalid dependency arguments/meson.build b/test cases/common/287 invalid dependency arguments/meson.build new file mode 100644 index 000000000..e85f181c1 --- /dev/null +++ b/test cases/common/287 invalid dependency arguments/meson.build @@ -0,0 +1,11 @@ +project('test', 'c') + +testcase expect_error('executable keyword argument 
\'dependencies\' was of type array[SubprojectHolder] but should have been array[Dependency | InternalDependency]') + executable('main', 'main.c', dependencies: subproject('sub')) +endtestcase + +lib = static_library('lib', 'lib.c') + +testcase expect_error('executable keyword argument \'dependencies\' was of type array[StaticLibrary] but should have been array[Dependency | InternalDependency]. Tried to use a build_target "lib" as a dependency. This should be in `link_with` or `link_whole` instead.') + executable('main', 'main.c', dependencies : lib) +endtestcase diff --git a/test cases/failing/124 subproject object as a dependency/subprojects/sub/meson.build b/test cases/common/287 invalid dependency arguments/subprojects/sub/meson.build index 0adfd6a6e..0adfd6a6e 100644 --- a/test cases/failing/124 subproject object as a dependency/subprojects/sub/meson.build +++ b/test cases/common/287 invalid dependency arguments/subprojects/sub/meson.build diff --git a/test cases/failing/100 no fallback/test.json b/test cases/failing/100 no fallback/test.json index 5fbffe35d..d3f7345b0 100644 --- a/test cases/failing/100 no fallback/test.json +++ b/test cases/failing/100 no fallback/test.json @@ -2,7 +2,7 @@ "stdout": [ { "match": "re", - "line": ".*/meson\\.build:2:11: ERROR: (Pkg-config binary for machine MachineChoice\\.HOST not found\\. Giving up\\.|Dependency \"foob\" not found, tried .*)" + "line": ".*/meson\\.build:2:11: ERROR: (Pkg-config binary for machine MachineChoice\\.HOST not found\\. 
Giving up\\.|Dependency \"foob\" not found \\(tried .*\\))" } ] } diff --git a/test cases/failing/124 subproject object as a dependency/meson.build b/test cases/failing/124 subproject object as a dependency/meson.build deleted file mode 100644 index 0114b9a31..000000000 --- a/test cases/failing/124 subproject object as a dependency/meson.build +++ /dev/null @@ -1,4 +0,0 @@ -project('test', 'c') - -executable( - 'main', 'main.c', dependencies: subproject('sub')) diff --git a/test cases/failing/124 subproject object as a dependency/test.json b/test cases/failing/124 subproject object as a dependency/test.json deleted file mode 100644 index a0eea225f..000000000 --- a/test cases/failing/124 subproject object as a dependency/test.json +++ /dev/null @@ -1,7 +0,0 @@ -{ - "stdout": [ - { - "line": "test cases/failing/124 subproject object as a dependency/meson.build:3:0: ERROR: Tried to use subproject object as a dependency." - } - ] -} diff --git a/test cases/failing/136 cargo toml error/meson.build b/test cases/failing/136 cargo toml error/meson.build new file mode 100644 index 000000000..6efaab2c5 --- /dev/null +++ b/test cases/failing/136 cargo toml error/meson.build @@ -0,0 +1,19 @@ +project('cargo-toml-error', 'c') + +if not add_languages('rust', required: false) + error('MESON_SKIP_TEST Rust not present, required for Cargo subprojects') +endif + +# Check if we have tomllib/tomli (not toml2json) +python = find_program('python3', 'python') +result = run_command(python, '-c', 'import tomllib', check: false) +if result.returncode() != 0 + result = run_command(python, '-c', 'import tomli', check: false) + if result.returncode() != 0 + # Skip test if using toml2json - error format will be different + error('MESON_SKIP_TEST toml2json in use, skipping test') + endif +endif + +# This should trigger a CargoTomlError with proper location info +foo_dep = dependency('foo-0') diff --git a/test cases/failing/136 cargo toml error/subprojects/foo-0-rs.wrap b/test cases/failing/136 
cargo toml error/subprojects/foo-0-rs.wrap new file mode 100644 index 000000000..c39970188 --- /dev/null +++ b/test cases/failing/136 cargo toml error/subprojects/foo-0-rs.wrap @@ -0,0 +1,5 @@ +[wrap-file] +method = cargo + +[provide] +dependency_names = foo-0 diff --git a/test cases/failing/136 cargo toml error/subprojects/foo-0-rs/Cargo.toml b/test cases/failing/136 cargo toml error/subprojects/foo-0-rs/Cargo.toml new file mode 100644 index 000000000..2f2d7a681 --- /dev/null +++ b/test cases/failing/136 cargo toml error/subprojects/foo-0-rs/Cargo.toml @@ -0,0 +1,6 @@ +[package] +name = "foo" +version = "0.0.1" +edition = "2021" +# This creates a TOML decode error: duplicate key +name = "bar"
\ No newline at end of file diff --git a/test cases/failing/136 cargo toml error/test.json b/test cases/failing/136 cargo toml error/test.json new file mode 100644 index 000000000..480cf64d5 --- /dev/null +++ b/test cases/failing/136 cargo toml error/test.json @@ -0,0 +1,8 @@ +{ + "stdout": [ + { + "match": "re", + "line": "test cases/failing/136 cargo toml error/(subprojects/foo-0-rs/Cargo\\.toml:6:13|meson\\.build:19:10): ERROR: Cannot overwrite a value( \\(at.*)?" + } + ] +} diff --git a/test cases/failing/33 dependency not-required then required/test.json b/test cases/failing/33 dependency not-required then required/test.json index 7dd851956..daf4f351a 100644 --- a/test cases/failing/33 dependency not-required then required/test.json +++ b/test cases/failing/33 dependency not-required then required/test.json @@ -2,7 +2,7 @@ "stdout": [ { "match": "re", - "line": ".*/meson\\.build:4:10: ERROR: (Pkg-config binary for machine MachineChoice\\.HOST not found\\. Giving up\\.|Dependency \"foo\\-bar\\-xyz\\-12\\.3\" not found, tried .*)" + "line": ".*/meson\\.build:4:10: ERROR: (Pkg-config binary for machine MachineChoice\\.HOST not found\\. Giving up\\.|Dependency \"foo\\-bar\\-xyz\\-12\\.3\" not found \\(tried .*\\))" } ] } diff --git a/test cases/failing/40 custom target outputs not matching install_dirs/test.json b/test cases/failing/40 custom target outputs not matching install_dirs/test.json index f9e2ba781..e6ea59770 100644 --- a/test cases/failing/40 custom target outputs not matching install_dirs/test.json +++ b/test cases/failing/40 custom target outputs not matching install_dirs/test.json @@ -27,7 +27,8 @@ ], "stdout": [ { - "line": "ERROR: Target 'too-few-install-dirs' has 3 outputs: ['toofew.h', 'toofew.c', 'toofew.sh'], but only 2 \"install_dir\"s were found." 
+ "line": "ERROR: Target 'too-few-install-dirs' has 3 outputs: \\['toofew.h', 'toofew.c', 'toofew.sh'\\], but 2 \"install_dir\"s were found: \\['([a-zA-Z]:)?/usr/include', False\\]\\.", + "match": "re" } ] } diff --git a/test cases/failing/52 link with executable/test.json b/test cases/failing/52 link with executable/test.json index ba9c34549..976d96706 100644 --- a/test cases/failing/52 link with executable/test.json +++ b/test cases/failing/52 link with executable/test.json @@ -1,7 +1,7 @@ { "stdout": [ { - "line": "test cases/failing/52 link with executable/meson.build:4:4: ERROR: Link target 'prog' is not linkable." + "line": "test cases/failing/52 link with executable/meson.build:4:4: ERROR: shared_module keyword argument \"link_with\" Link target \"prog\" is not linkable" } ] } diff --git a/test cases/failing/59 string as link target/test.json b/test cases/failing/59 string as link target/test.json index ddc639980..2ecfaf02a 100644 --- a/test cases/failing/59 string as link target/test.json +++ b/test cases/failing/59 string as link target/test.json @@ -1,7 +1,7 @@ { "stdout": [ { - "line": "test cases/failing/59 string as link target/meson.build:2:0: ERROR: '' is not a target." 
+ "line": "test cases/failing/59 string as link target/meson.build:2:0: ERROR: executable keyword argument 'link_with' was of type array[str] but should have been array[BothLibraries | SharedLibrary | StaticLibrary | CustomTarget | CustomTargetIndex | Jar | Executable]" } ] } diff --git a/test cases/failing/62 wrong boost module/test.json b/test cases/failing/62 wrong boost module/test.json index 75ef82b7e..2765ce917 100644 --- a/test cases/failing/62 wrong boost module/test.json +++ b/test cases/failing/62 wrong boost module/test.json @@ -1,7 +1,7 @@ { "stdout": [ { - "line": "test cases/failing/62 wrong boost module/meson.build:9:10: ERROR: Dependency \"boost\" not found, tried system" + "line": "test cases/failing/62 wrong boost module/meson.build:9:10: ERROR: Dependency \"boost\" not found (tried system)" } ] } diff --git a/test cases/failing/79 dub library/test.json b/test cases/failing/79 dub library/test.json index 9f59604a4..ea4e73490 100644 --- a/test cases/failing/79 dub library/test.json +++ b/test cases/failing/79 dub library/test.json @@ -1,7 +1,7 @@ { "stdout": [ { - "line": "test cases/failing/79 dub library/meson.build:11:0: ERROR: Dependency \"dubtestproject\" not found" + "line": "test cases/failing/79 dub library/meson.build:11:0: ERROR: Dependency \"dubtestproject\" not found (tried dub)" } ] } diff --git a/test cases/failing/80 dub executable/test.json b/test cases/failing/80 dub executable/test.json index edb74f62c..97dab9f87 100644 --- a/test cases/failing/80 dub executable/test.json +++ b/test cases/failing/80 dub executable/test.json @@ -1,7 +1,7 @@ { "stdout": [ { - "line": "test cases/failing/80 dub executable/meson.build:11:0: ERROR: Dependency \"dubtestproject:test1\" not found" + "line": "test cases/failing/80 dub executable/meson.build:11:0: ERROR: Dependency \"dubtestproject:test1\" not found (tried dub)" } ] } diff --git a/test cases/failing/81 dub compiler/test.json b/test cases/failing/81 dub compiler/test.json index 
d82984d5d..1018d478b 100644
--- a/test cases/failing/81 dub compiler/test.json
+++ b/test cases/failing/81 dub compiler/test.json
@@ -13,7 +13,7 @@
   },
   "stdout": [
     {
-      "line": "test cases/failing/81 dub compiler/meson.build:17:0: ERROR: Dependency \"dubtestproject:test2\" not found"
+      "line": "test cases/failing/81 dub compiler/meson.build:17:0: ERROR: Dependency \"dubtestproject:test2\" not found (tried dub)"
     }
   ]
 }
diff --git a/test cases/rust/31 rust.workspace package/Cargo.lock b/test cases/rust/31 rust.workspace package/Cargo.lock
new file mode 100644
index 000000000..989f6ff5b
--- /dev/null
+++ b/test cases/rust/31 rust.workspace package/Cargo.lock
@@ -0,0 +1,19 @@
+# This file is automatically @generated by Cargo.
+# It is not intended for manual editing.
+version = 4
+
+[[package]]
+name = "answer"
+version = "2.1.0"
+
+[[package]]
+name = "hello"
+version = "1.0.0"
+
+[[package]]
+name = "package_test"
+version = "0.1.0"
+dependencies = [
+ "answer",
+ "hello",
+]
diff --git a/test cases/rust/31 rust.workspace package/Cargo.toml b/test cases/rust/31 rust.workspace package/Cargo.toml
new file mode 100644
index 000000000..00bb0878e
--- /dev/null
+++ b/test cases/rust/31 rust.workspace package/Cargo.toml
@@ -0,0 +1,13 @@
+[package]
+name = "package_test"
+version = "0.1.0"
+edition = "2021"
+
+[features]
+default = ["feature1", "hello?/goodbye"]
+feature1 = ["answer/large", "dep:hello"]
+feature2 = []
+
+[dependencies]
+hello = { version = "1.0", path = "subprojects/hello-1.0", optional = true }
+answer = { version = "2.1", path = "subprojects/answer-2.1", optional = true }
diff --git a/test cases/rust/31 rust.workspace package/meson.build b/test cases/rust/31 rust.workspace package/meson.build
new file mode 100644
index 000000000..d00a6096a
--- /dev/null
+++ b/test cases/rust/31 rust.workspace package/meson.build
@@ -0,0 +1,51 @@
+project('package test', 'rust', default_options: ['rust_std=2021'])
+
+rust = import('rust')
+cargo_ws = rust.workspace()
+
+# Test workspace.packages() method
+assert(cargo_ws.packages() == ['answer', 'hello', 'package_test'])
+
+main_pkg = cargo_ws.package()
+assert(main_pkg.name() == 'package_test')
+assert(main_pkg.version() == '0.1.0')
+assert(main_pkg.api() == '0.1')
+assert(main_pkg.all_features() == ['answer', 'default', 'feature1', 'feature2'])
+assert(main_pkg.features() == ['default', 'feature1'])
+
+hello_rs = cargo_ws.subproject('hello')
+assert(hello_rs.name() == 'hello')
+assert(hello_rs.version() == '1.0.0')
+assert(hello_rs.api() == '1')
+assert(hello_rs.all_features() == ['default', 'goodbye'])
+assert(hello_rs.features() == ['default', 'goodbye'])
+
+e = executable('package-test', 'src/main.rs',
+  dependencies: main_pkg.dependencies(),
+  rust_args: main_pkg.rust_args(),
+  rust_dependency_map: main_pkg.rust_dependency_map(),
+)
+test('package-test', e)
+
+answer_rs = cargo_ws.subproject('answer', '2')
+assert(answer_rs.name() == 'answer')
+assert(answer_rs.version() == '2.1.0')
+assert(answer_rs.api() == '2')
+assert(answer_rs.all_features() == ['default', 'large'])
+assert(answer_rs.features() == ['default', 'large'])
+
+# failure test cases for package()
+testcase expect_error('argument to package() cannot be a subproject')
+  cargo_ws.package('hello')
+endtestcase
+testcase expect_error('workspace member "nonexistent" not found')
+  cargo_ws.package('nonexistent')
+endtestcase
+
+# failure test cases for dependency()
+testcase expect_error('package.dependency.*must be one of c, proc-macro, rust.*', how: 're')
+  hello_rs.dependency(rust_abi: 'something else')
+endtestcase
+testcase expect_error('Package hello does not support ABI c')
+  hello_rs.dependency(rust_abi: 'c')
+endtestcase
diff --git a/test cases/rust/31 rust.workspace package/src/main.rs b/test cases/rust/31 rust.workspace package/src/main.rs
new file mode 100644
index 000000000..13c02dd64
--- /dev/null
+++ b/test cases/rust/31 rust.workspace package/src/main.rs
@@ -0,0 +1,8 @@
+use hello::{farewell, greet};
+
+fn main() {
+    println!("{}", greet());
+    println!("{}", farewell());
+    println!("{}", answer::answer());
+    println!("{}", answer::large_answer());
+}
diff --git a/test cases/rust/31 rust.workspace package/subprojects/answer-2-rs.wrap b/test cases/rust/31 rust.workspace package/subprojects/answer-2-rs.wrap
new file mode 100644
index 000000000..d16d1f7a8
--- /dev/null
+++ b/test cases/rust/31 rust.workspace package/subprojects/answer-2-rs.wrap
@@ -0,0 +1,2 @@
+[wrap-file]
+directory = answer-2.1
diff --git a/test cases/rust/31 rust.workspace package/subprojects/answer-2.1/Cargo.toml b/test cases/rust/31 rust.workspace package/subprojects/answer-2.1/Cargo.toml
new file mode 100644
index 000000000..b87782264
--- /dev/null
+++ b/test cases/rust/31 rust.workspace package/subprojects/answer-2.1/Cargo.toml
@@ -0,0 +1,10 @@
+[package]
+name = "answer"
+version = "2.1.0"
+edition = "2021"
+
+[lib]
+crate-type = ["lib"]
+
+[features]
+large = []
diff --git a/test cases/rust/31 rust.workspace package/subprojects/answer-2.1/meson.build b/test cases/rust/31 rust.workspace package/subprojects/answer-2.1/meson.build
new file mode 100644
index 000000000..d78795602
--- /dev/null
+++ b/test cases/rust/31 rust.workspace package/subprojects/answer-2.1/meson.build
@@ -0,0 +1,15 @@
+project('answer', 'rust', default_options: ['rust_std=2021'])
+
+rust = import('rust')
+cargo_ws = rust.workspace()
+assert(cargo_ws.packages() == ['answer'])
+
+answer_pkg = cargo_ws.package()
+assert(answer_pkg.all_features() == ['default', 'large'])
+assert(answer_pkg.features() == ['default', 'large'])
+
+l = static_library('answer', 'src/lib.rs',
+  rust_args: answer_pkg.rust_args(),
+  rust_dependency_map: answer_pkg.rust_dependency_map())
+dep = declare_dependency(link_with: l)
+meson.override_dependency('answer-2-rs', dep)
diff --git a/test cases/rust/31 rust.workspace package/subprojects/answer-2.1/src/lib.rs b/test cases/rust/31 rust.workspace package/subprojects/answer-2.1/src/lib.rs
new file mode 100644
index 000000000..b7a721b05
--- /dev/null
+++ b/test cases/rust/31 rust.workspace package/subprojects/answer-2.1/src/lib.rs
@@ -0,0 +1,10 @@
+pub fn answer() -> u8
+{
+    42
+}
+
+#[cfg(feature = "large")]
+pub fn large_answer() -> u64
+{
+    42
+}
diff --git a/test cases/rust/31 rust.workspace package/subprojects/hello-1-rs.wrap b/test cases/rust/31 rust.workspace package/subprojects/hello-1-rs.wrap
new file mode 100644
index 000000000..25e7751d0
--- /dev/null
+++ b/test cases/rust/31 rust.workspace package/subprojects/hello-1-rs.wrap
@@ -0,0 +1,3 @@
+[wrap-file]
+directory = hello-1.0
+method = cargo
diff --git a/test cases/rust/31 rust.workspace package/subprojects/hello-1.0/Cargo.toml b/test cases/rust/31 rust.workspace package/subprojects/hello-1.0/Cargo.toml
new file mode 100644
index 000000000..f6ab8eb91
--- /dev/null
+++ b/test cases/rust/31 rust.workspace package/subprojects/hello-1.0/Cargo.toml
@@ -0,0 +1,10 @@
+[package]
+name = "hello"
+version = "1.0.0"
+edition = "2021"
+
+[lib]
+crate-type = ["lib"]
+
+[features]
+goodbye = []
diff --git a/test cases/rust/31 rust.workspace package/subprojects/hello-1.0/src/lib.rs b/test cases/rust/31 rust.workspace package/subprojects/hello-1.0/src/lib.rs
new file mode 100644
index 000000000..47346350b
--- /dev/null
+++ b/test cases/rust/31 rust.workspace package/subprojects/hello-1.0/src/lib.rs
@@ -0,0 +1,10 @@
+pub fn greet() -> &'static str
+{
+    "hello world"
+}
+
+#[cfg(feature = "goodbye")]
+pub fn farewell() -> &'static str
+{
+    "goodbye"
+}
diff --git a/test cases/rust/32 rust.workspace workspace/Cargo.lock b/test cases/rust/32 rust.workspace workspace/Cargo.lock
new file mode 100644
index 000000000..7a697ea54
--- /dev/null
+++ b/test cases/rust/32 rust.workspace workspace/Cargo.lock
@@ -0,0 +1,24 @@
+# This file is automatically @generated by Cargo.
+# It is not intended for manual editing.
+version = 4
+
+[[package]]
+name = "answer"
+version = "2.1.0"
+
+[[package]]
+name = "hello"
+version = "1.0.0"
+
+[[package]]
+name = "more"
+version = "0.1.0"
+
+[[package]]
+name = "workspace_test"
+version = "0.1.0"
+dependencies = [
+ "answer",
+ "hello",
+ "more",
+]
diff --git a/test cases/rust/32 rust.workspace workspace/Cargo.toml b/test cases/rust/32 rust.workspace workspace/Cargo.toml
new file mode 100644
index 000000000..ae2ae4aa7
--- /dev/null
+++ b/test cases/rust/32 rust.workspace workspace/Cargo.toml
@@ -0,0 +1,17 @@
+[workspace]
+members = [".", "more"]
+
+[package]
+name = "workspace_test"
+version = "0.1.0"
+edition = "2021"
+
+[features]
+default = ["feature1", "hello?/goodbye"]
+feature1 = ["answer/large", "dep:hello"]
+feature2 = []
+
+[dependencies]
+hello = { version = "1.0", path = "subprojects/hello-1.0", optional = true }
+answer = { version = "2.1", path = "subprojects/answer-2.1", optional = true }
+more = { path = "more" }
diff --git a/test cases/rust/32 rust.workspace workspace/meson.build b/test cases/rust/32 rust.workspace workspace/meson.build
new file mode 100644
index 000000000..d54abd89b
--- /dev/null
+++ b/test cases/rust/32 rust.workspace workspace/meson.build
@@ -0,0 +1,53 @@
+project('workspace test', 'rust', default_options: ['rust_std=2021'])
+
+rust = import('rust')
+cargo_ws = rust.workspace()
+
+# Test workspace.packages() method
+assert(cargo_ws.packages() == ['answer', 'hello', 'more', 'workspace_test'])
+
+main_pkg = cargo_ws.package()
+assert(main_pkg.name() == 'workspace_test')
+assert(main_pkg.version() == '0.1.0')
+assert(main_pkg.api() == '0.1')
+assert(main_pkg.all_features() == ['answer', 'default', 'feature1', 'feature2'])
+assert(main_pkg.features() == ['default', 'feature1'])
+
+hello_rs = cargo_ws.subproject('hello')
+assert(hello_rs.name() == 'hello')
+assert(hello_rs.version() == '1.0.0')
+assert(hello_rs.api() == '1')
+assert(hello_rs.all_features() == ['default', 'goodbye'])
+assert(hello_rs.features() == ['default', 'goodbye'])
+
+subdir('more')
+
+e = executable('workspace-test', 'src/main.rs',
+  dependencies: main_pkg.dependencies(),
+  rust_args: main_pkg.rust_args(),
+  rust_dependency_map: main_pkg.rust_dependency_map(),
+)
+test('workspace-test', e)
+
+answer_rs = cargo_ws.subproject('answer', '2')
+assert(answer_rs.name() == 'answer')
+assert(answer_rs.version() == '2.1.0')
+assert(answer_rs.api() == '2')
+assert(answer_rs.all_features() == ['default', 'large'])
+assert(answer_rs.features() == ['default', 'large'])
+
+# failure test cases for package()
+testcase expect_error('argument to package() cannot be a subproject')
+  cargo_ws.package('hello')
+endtestcase
+testcase expect_error('workspace member "nonexistent" not found')
+  cargo_ws.package('nonexistent')
+endtestcase
+
+# failure test cases for dependency()
+testcase expect_error('package.dependency.*must be one of c, proc-macro, rust.*', how: 're')
+  hello_rs.dependency(rust_abi: 'something else')
+endtestcase
+testcase expect_error('Package hello does not support ABI c')
+  hello_rs.dependency(rust_abi: 'c')
+endtestcase
diff --git a/test cases/rust/32 rust.workspace workspace/more/Cargo.toml b/test cases/rust/32 rust.workspace workspace/more/Cargo.toml
new file mode 100644
index 000000000..1e7e46a5f
--- /dev/null
+++ b/test cases/rust/32 rust.workspace workspace/more/Cargo.toml
@@ -0,0 +1,7 @@
+[package]
+name = "more"
+version = "0.1.0"
+edition = "2021"
+
+[lib]
+crate-type = ["rlib"]
\ No newline at end of file
diff --git a/test cases/rust/32 rust.workspace workspace/more/meson.build b/test cases/rust/32 rust.workspace workspace/more/meson.build
new file mode 100644
index 000000000..4baf09a29
--- /dev/null
+++ b/test cases/rust/32 rust.workspace workspace/more/meson.build
@@ -0,0 +1,12 @@
+more_pkg = cargo_ws.package('more')
+
+assert(more_pkg.name() == 'more')
+assert(more_pkg.features() == ['default'])
+assert(more_pkg.all_features() == ['default'])
+
+l = static_library('more', 'src/lib.rs',
+  rust_args: more_pkg.rust_args(),
+  rust_dependency_map: more_pkg.rust_dependency_map(),
+)
+more_dep = declare_dependency(link_with: l)
+more_pkg.override_dependency(more_dep)
diff --git a/test cases/rust/32 rust.workspace workspace/more/src/lib.rs b/test cases/rust/32 rust.workspace workspace/more/src/lib.rs
new file mode 100644
index 000000000..178adaca9
--- /dev/null
+++ b/test cases/rust/32 rust.workspace workspace/more/src/lib.rs
@@ -0,0 +1,3 @@
+pub fn do_something() {
+    println!("Doing something in more crate");
+}
\ No newline at end of file
diff --git a/test cases/rust/32 rust.workspace workspace/src/main.rs b/test cases/rust/32 rust.workspace workspace/src/main.rs
new file mode 100644
index 000000000..3c8c968e2
--- /dev/null
+++ b/test cases/rust/32 rust.workspace workspace/src/main.rs
@@ -0,0 +1,9 @@
+use hello::{farewell, greet};
+
+fn main() {
+    println!("{}", greet());
+    println!("{}", farewell());
+    println!("{}", answer::answer());
+    println!("{}", answer::large_answer());
+    more::do_something();
+}
diff --git a/test cases/rust/32 rust.workspace workspace/subprojects/answer-2-rs.wrap b/test cases/rust/32 rust.workspace workspace/subprojects/answer-2-rs.wrap
new file mode 100644
index 000000000..d16d1f7a8
--- /dev/null
+++ b/test cases/rust/32 rust.workspace workspace/subprojects/answer-2-rs.wrap
@@ -0,0 +1,2 @@
+[wrap-file]
+directory = answer-2.1
diff --git a/test cases/rust/32 rust.workspace workspace/subprojects/answer-2.1/Cargo.toml b/test cases/rust/32 rust.workspace workspace/subprojects/answer-2.1/Cargo.toml
new file mode 100644
index 000000000..b87782264
--- /dev/null
+++ b/test cases/rust/32 rust.workspace workspace/subprojects/answer-2.1/Cargo.toml
@@ -0,0 +1,10 @@
+[package]
+name = "answer"
+version = "2.1.0"
+edition = "2021"
+
+[lib]
+crate-type = ["lib"]
+
+[features]
+large = []
diff --git a/test cases/rust/32 rust.workspace workspace/subprojects/answer-2.1/meson.build b/test cases/rust/32 rust.workspace workspace/subprojects/answer-2.1/meson.build
new file mode 100644
index 000000000..d78795602
--- /dev/null
+++ b/test cases/rust/32 rust.workspace workspace/subprojects/answer-2.1/meson.build
@@ -0,0 +1,15 @@
+project('answer', 'rust', default_options: ['rust_std=2021'])
+
+rust = import('rust')
+cargo_ws = rust.workspace()
+assert(cargo_ws.packages() == ['answer'])
+
+answer_pkg = cargo_ws.package()
+assert(answer_pkg.all_features() == ['default', 'large'])
+assert(answer_pkg.features() == ['default', 'large'])
+
+l = static_library('answer', 'src/lib.rs',
+  rust_args: answer_pkg.rust_args(),
+  rust_dependency_map: answer_pkg.rust_dependency_map())
+dep = declare_dependency(link_with: l)
+meson.override_dependency('answer-2-rs', dep)
diff --git a/test cases/rust/32 rust.workspace workspace/subprojects/answer-2.1/src/lib.rs b/test cases/rust/32 rust.workspace workspace/subprojects/answer-2.1/src/lib.rs
new file mode 100644
index 000000000..b7a721b05
--- /dev/null
+++ b/test cases/rust/32 rust.workspace workspace/subprojects/answer-2.1/src/lib.rs
@@ -0,0 +1,10 @@
+pub fn answer() -> u8
+{
+    42
+}
+
+#[cfg(feature = "large")]
+pub fn large_answer() -> u64
+{
+    42
+}
diff --git a/test cases/rust/32 rust.workspace workspace/subprojects/hello-1-rs.wrap b/test cases/rust/32 rust.workspace workspace/subprojects/hello-1-rs.wrap
new file mode 100644
index 000000000..25e7751d0
--- /dev/null
+++ b/test cases/rust/32 rust.workspace workspace/subprojects/hello-1-rs.wrap
@@ -0,0 +1,3 @@
+[wrap-file]
+directory = hello-1.0
+method = cargo
diff --git a/test cases/rust/32 rust.workspace workspace/subprojects/hello-1.0/Cargo.toml b/test cases/rust/32 rust.workspace workspace/subprojects/hello-1.0/Cargo.toml
new file mode 100644
index 000000000..f6ab8eb91
--- /dev/null
+++ b/test cases/rust/32 rust.workspace workspace/subprojects/hello-1.0/Cargo.toml
@@ -0,0 +1,10 @@
+[package]
+name = "hello"
+version = "1.0.0"
+edition = "2021"
+
+[lib]
+crate-type = ["lib"]
+
+[features]
+goodbye = []
diff --git a/test cases/rust/32 rust.workspace workspace/subprojects/hello-1.0/src/lib.rs b/test cases/rust/32 rust.workspace workspace/subprojects/hello-1.0/src/lib.rs
new file mode 100644
index 000000000..47346350b
--- /dev/null
+++ b/test cases/rust/32 rust.workspace workspace/subprojects/hello-1.0/src/lib.rs
@@ -0,0 +1,10 @@
+pub fn greet() -> &'static str
+{
+    "hello world"
+}
+
+#[cfg(feature = "goodbye")]
+pub fn farewell() -> &'static str
+{
+    "goodbye"
+}
diff --git a/unittests/allplatformstests.py b/unittests/allplatformstests.py
index 1304658da..0959f22a3 100644
--- a/unittests/allplatformstests.py
+++ b/unittests/allplatformstests.py
@@ -2156,7 +2156,7 @@ class AllPlatformTests(BasePlatformTests):
         # Find foo dependency
         os.environ['PKG_CONFIG_LIBDIR'] = self.privatedir
         env = get_fake_env(testdir, self.builddir, self.prefix)
-        kwargs = {'required': True, 'silent': True}
+        kwargs = {'required': True, 'silent': True, 'native': MachineChoice.HOST}
         foo_dep = PkgConfigDependency('libanswer', env, kwargs)
         # Ensure link_args are properly quoted
         libdir = PurePath(prefix) / PurePath(libdir)
diff --git a/unittests/cargotests.py b/unittests/cargotests.py
index 643ceceb4..e8eace74f 100644
--- a/unittests/cargotests.py
+++ b/unittests/cargotests.py
@@ -428,7 +428,7 @@ class CargoTomlTest(unittest.TestCase):
         self.assertEqual(manifest.lints[2].name, 'unexpected_cfgs')
         self.assertEqual(manifest.lints[2].level, 'deny')
         self.assertEqual(manifest.lints[2].priority, 0)
-        self.assertEqual(manifest.lints[2].check_cfg, ['cfg(test)', 'cfg(MESON)'])
+        self.assertEqual(manifest.lints[2].check_cfg, ['cfg(MESON)'])
 
     def test_cargo_toml_lints_to_args(self) -> None:
         with tempfile.TemporaryDirectory() as tmpdir:
@@ -444,8 +444,7 @@
         self.assertEqual(manifest.lints[1].to_arguments(True), ['-A', 'unknown_lints'])
         self.assertEqual(manifest.lints[2].to_arguments(False), ['-D', 'unexpected_cfgs'])
         self.assertEqual(manifest.lints[2].to_arguments(True),
-                         ['-D', 'unexpected_cfgs', '--check-cfg', 'cfg(test)',
-                          '--check-cfg', 'cfg(MESON)'])
+                         ['-D', 'unexpected_cfgs', '--check-cfg', 'cfg(MESON)'])
 
     def test_cargo_toml_dependencies(self) -> None:
         with tempfile.TemporaryDirectory() as tmpdir:
diff --git a/unittests/internaltests.py b/unittests/internaltests.py
index 74b36a83e..27de7afb1 100644
--- a/unittests/internaltests.py
+++ b/unittests/internaltests.py
@@ -667,7 +667,7 @@ class InternalTests(unittest.TestCase):
         with mock.patch.object(PkgConfigInterface, 'instance') as instance_method:
             instance_method.return_value = FakeInstance(env, MachineChoice.HOST, silent=True)
-            kwargs = {'required': True, 'silent': True}
+            kwargs = {'required': True, 'silent': True, 'native': MachineChoice.HOST}
             foo_dep = PkgConfigDependency('foo', env, kwargs)
             self.assertEqual(foo_dep.get_link_args(),
                              [(p1 / 'libfoo.a').as_posix(), (p2 / 'libbar.a').as_posix()])
@@ -1038,14 +1038,14 @@
             'test_dep',
             methods=[b.DependencyMethods.PKGCONFIG, b.DependencyMethods.CMAKE]
         )
-        actual = [m() for m in f(env, MachineChoice.HOST, {'required': False})]
+        actual = [m() for m in f(env, {'required': False, 'native': MachineChoice.HOST})]
         self.assertListEqual([m.type_name for m in actual], ['pkgconfig', 'cmake'])
 
         f = F.DependencyFactory(
             'test_dep',
             methods=[b.DependencyMethods.CMAKE, b.DependencyMethods.PKGCONFIG]
         )
-        actual = [m() for m in f(env, MachineChoice.HOST, {'required': False})]
+        actual = [m() for m in f(env, {'required': False, 'native': MachineChoice.HOST})]
         self.assertListEqual([m.type_name for m in actual], ['cmake', 'pkgconfig'])
 
     def test_validate_json(self) -> None:
diff --git a/unittests/linuxliketests.py b/unittests/linuxliketests.py
index 4f775d651..e879ac3dd 100644
--- a/unittests/linuxliketests.py
+++ b/unittests/linuxliketests.py
@@ -149,7 +149,7 @@ class LinuxlikeTests(BasePlatformTests):
         testdir = os.path.join(self.common_test_dir, '44 pkgconfig-gen')
         self.init(testdir)
         env = get_fake_env(testdir, self.builddir, self.prefix)
-        kwargs = {'required': True, 'silent': True}
+        kwargs = {'required': True, 'silent': True, 'native': MachineChoice.HOST}
        os.environ['PKG_CONFIG_LIBDIR'] = self.privatedir
         foo_dep = PkgConfigDependency('libfoo', env, kwargs)
         self.assertTrue(foo_dep.found())
@@ -1145,7 +1145,7 @@
         env = get_fake_env(testdir, self.builddir, self.prefix)
         env.coredata.optstore.set_option(OptionKey('pkg_config_path'), pkg_dir)
-        kwargs = {'required': True, 'silent': True}
+        kwargs = {'required': True, 'silent': True, 'native': MachineChoice.HOST}
         relative_path_dep = PkgConfigDependency('librelativepath', env, kwargs)
         self.assertTrue(relative_path_dep.found())
diff --git a/unittests/optiontests.py b/unittests/optiontests.py
index a3a2f54e9..ea3d193e0 100644
--- a/unittests/optiontests.py
+++ b/unittests/optiontests.py
@@ -495,3 +495,44 @@ class OptionTests(unittest.TestCase):
         child_key = OptionKey(name, subproject_name)
         optstore.add_project_option(child_key, child_option)
         self.assertTrue(optstore.options[child_key].yielding)
+
+    def test_machine_canonicalization_cross(self):
+        """Test that BUILD machine options are handled correctly in cross compilation."""
+        optstore = OptionStore(True)
+
+        # Test that BUILD machine per-machine option is NOT canonicalized to HOST
+        host_pkg_config = OptionKey('pkg_config_path', machine=MachineChoice.HOST)
+        build_pkg_config = OptionKey('pkg_config_path', machine=MachineChoice.BUILD)
+        host_option_obj = UserStringArrayOption('pkg_config_path', 'Host pkg-config paths', ['/mingw/lib64/pkgconfig'])
+        build_option_obj = UserStringArrayOption('pkg_config_path', 'Build pkg-config paths', ['/usr/lib64/pkgconfig'])
+        optstore.add_system_option(host_pkg_config, host_option_obj)
+        optstore.add_system_option(build_pkg_config, build_option_obj)
+        option, value = optstore.get_option_and_value_for(build_pkg_config)
+        self.assertEqual(value, ['/usr/lib64/pkgconfig'])
+
+        # Test that non-per-machine BUILD option IS canonicalized to HOST
+        build_opt = OptionKey('optimization', machine=MachineChoice.BUILD)
+        host_opt = OptionKey('optimization', machine=MachineChoice.HOST)
+        common_option_obj = UserComboOption('optimization', 'Optimization level', '0',
+                                            choices=['plain', '0', 'g', '1', '2', '3', 's'])
+        optstore.add_system_option(host_opt, common_option_obj)
+        self.assertEqual(optstore.get_value_for(build_opt), '0')
+
+    def test_machine_canonicalization_native(self):
+        """Test that BUILD machine options are canonicalized to HOST when not cross compiling."""
+        optstore = OptionStore(False)
+
+        host_pkg_config = OptionKey('pkg_config_path', machine=MachineChoice.HOST)
+        build_pkg_config = OptionKey('pkg_config_path', machine=MachineChoice.BUILD)
+        host_option_obj = UserStringArrayOption('pkg_config_path', 'Host pkg-config paths', ['/mingw/lib64/pkgconfig'])
+        build_option_obj = UserStringArrayOption('pkg_config_path', 'Build pkg-config paths', ['/usr/lib64/pkgconfig'])
+
+        # Add per-machine option for HOST only (BUILD will be canonicalized)
+        optstore.add_system_option(host_pkg_config, host_option_obj)
+        option, value = optstore.get_option_and_value_for(build_pkg_config)
+        self.assertEqual(value, ['/mingw/lib64/pkgconfig'])
+
+        # Try again adding build option too, for completeness
+        optstore.add_system_option(build_pkg_config, build_option_obj)
+        option, value = optstore.get_option_and_value_for(build_pkg_config)
+        self.assertEqual(value, ['/mingw/lib64/pkgconfig'])
