eko.io package

Input/output interfaces, including (de)serialization.

Submodules

eko.io.access module

Manage file system resources access.

exception eko.io.access.ReadOnlyOperator[source]

Bases: RuntimeError, OutputError

It is not possible to write on a read-only operator.

In particular, the behavior would be deceitful: writing is possible in memory and even in the temporary folder, but nothing is ever written to the persistent archive, so any modification is lost after exiting the program.

exception eko.io.access.ClosedOperator[source]

Bases: RuntimeError, OutputError

It is not possible to write on nor to read from a closed operator.

Unlike ReadOnlyOperator, in this case not even writing to the temporary folder is possible.

Some properties may still appear accessible, but the operator is actually closed and in general should not be used any longer. However, for extremely simple properties, like those available in memory from eko.io.struct.Metadata or eko.io.struct.AccessConfigs, there is no need to raise on read, since those properties are actually available; they should always raise on writing, though, since the written content is not persisted and the apparent success would be deceitful.

Still, the level of protection is mild, since thorough protection would considerably clutter the code and require a lot of maintenance. “We are all adults here.”

class eko.io.access.AccessConfigs(path: Path, readonly: bool, open: bool)[source]

Bases: object

Configurations specified during opening of an EKO.

path: Path

The path to the permanent object.

readonly: bool

Read-only flag

open: bool

EKO status

property read

Check reading permission.

Reading access is always granted on open operator.

property write

Check writing permission.

assert_open()[source]

Assert operator is open.

Raises:

exceptions.ClosedOperator – if operator is closed

assert_writeable(msg: str | None = None)[source]

Assert operator is writeable.

Raises:
  • exceptions.ClosedOperator – see assert_open()

  • exceptions.ReadOnlyOperator – if the operator has been declared read-only
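
For illustration, a minimal sketch of how these checks combine (the path and flags are hypothetical):

from pathlib import Path
from eko.io.access import AccessConfigs

acc = AccessConfigs(path=Path("eko.tar"), readonly=True, open=True)
assert acc.read         # reading is granted on an open operator
assert not acc.write    # writing is denied by the read-only flag
acc.assert_writeable()  # raises ReadOnlyOperator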

eko.io.bases module

Operators bases.

class eko.io.bases.Bases(xgrid: XGrid, _targetgrid: XGrid | None = None, _inputgrid: XGrid | None = None, _targetpids: ndarray[Any, dtype[_ScalarType_co]] | None = None, _inputpids: ndarray[Any, dtype[_ScalarType_co]] | None = None)[source]

Bases: DictLike

Rotations related configurations.

Here “Rotation” is intended in a broad sense: it includes both rotations in flavor space (labeled with suffix pids) and in \(x\)-space (labeled with suffix grid). Rotations in \(x\)-space correspond to reinterpolating the result on a different polynomial basis.

xgrid: XGrid

Internal momentum fraction grid.

property pids

Internal flavor basis, used for computation.

property inputpids: ndarray[Any, dtype[_ScalarType_co]]

Provide pids expected on the input PDF.

property targetpids: ndarray[Any, dtype[_ScalarType_co]]

Provide pids corresponding to the output PDF.

property inputgrid: XGrid

Provide \(x\)-grid expected on the input PDF.

property targetgrid: XGrid

Provide \(x\)-grid corresponding to the output PDF.

classmethod from_dict(dictionary: dict)[source]

Deserialize rotation.

Load from full state, but with public names.

property raw

Serialize rotation.

Pass through interfaces, access internal values but with a public name.
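
A minimal serialization round trip, assuming XGrid can be imported from eko.interpolation and constructed from a plain list:

from eko.interpolation import XGrid
from eko.io.bases import Bases

bases = Bases(xgrid=XGrid([1e-3, 1e-2, 1e-1, 1.0]))
raw = bases.raw                  # plain dict, with public names
restored = Bases.from_dict(raw)  # back to the structured object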

eko.io.dictlike module

Abstraction for serializations.

A few known types are directly registered here, in order to be transparently codified in more native structures.

class eko.io.dictlike.DictLike[source]

Bases: object

Dictionary compatibility base class, for dataclasses.

This class adds compatibility to import from and export to Python dict, in order to support serialization interfaces working with dictionaries.

Some collections and scalar objects are normalized to native Python structures, in order to simplify the on-disk representation.

classmethod from_dict(dictionary)[source]

Deserialize, overwritable interface.

The default implementation is just DictLike._from_dict(), but it can be safely overwritten (usually transforming the input before a call to DictLike._from_dict() itself).

property raw

Overwritable serialization.

The default implementation is just DictLike._raw(), but it can be safely overwritten (usually starting from DictLike._raw() itself).

property public_raw

Serialize only public attributes.
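
As a sketch, any dataclass inheriting from DictLike gains the dict round trip (the Point class is hypothetical):

from dataclasses import dataclass

from eko.io.dictlike import DictLike

@dataclass
class Point(DictLike):
    x: float
    y: float

p = Point.from_dict(dict(x=0.1, y=0.2))
assert p.raw == dict(x=0.1, y=0.2)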

eko.io.dictlike.load_field(type_, value)[source]

Deserialize dataclass field.

eko.io.dictlike.load_enum(type_, value)[source]

Deserialize enum variant.

Accepts both the name and value of variants, attempted in this order.

Raises:

ValueError – if value is not the name nor the value of any enum variant
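
For example, both the name and the value of a variant resolve to the same member (using EvolutionMethod from eko.io.types):

from eko.io.dictlike import load_enum
from eko.io.types import EvolutionMethod

assert load_enum(EvolutionMethod, "ITERATE_EXACT") is EvolutionMethod.ITERATE_EXACT
assert load_enum(EvolutionMethod, "iterate-exact") is EvolutionMethod.ITERATE_EXACT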

eko.io.dictlike.load_typing(type_, value)[source]

Deserialize type hint associated field.

eko.io.dictlike.raw_field(value)[source]

Serialize DictLike field.

eko.io.exceptions module

IO generic exceptions.

exception eko.io.exceptions.OutputError[source]

Bases: Exception

Generic Output Error.

exception eko.io.exceptions.OutputExistsError[source]

Bases: FileExistsError, OutputError

Output file already exists.

exception eko.io.exceptions.OutputNotTar[source]

Bases: ValueError, OutputError

Specified file is not a .tar archive.

exception eko.io.exceptions.OperatorLoadingError[source]

Bases: ValueError, OutputError

Issue encountered while loading an operator.

exception eko.io.exceptions.OperatorLocationError(path: PathLike)[source]

Bases: ValueError, OutputError

The path supposed to store an operator is in the wrong location.

eko.io.inventory module

Manage assets used during computation.

exception eko.io.inventory.LookupError[source]

Bases: ValueError

Failure in content retrieval from inventory.

eko.io.inventory.encode(header: Header)[source]

Extract a hash from a header.

eko.io.inventory.header_name(header: Header)[source]

Determine header file name.

eko.io.inventory.operator_name(header: Header, err: bool)[source]

Determine operator file name, from the associated header.

class eko.io.inventory.Inventory(path: ~pathlib.Path, access: ~eko.io.access.AccessConfigs, header_type: ~typing.Type[~eko.io.items.Header], cache: ~typing.Dict[~eko.io.inventory.H, ~eko.io.items.Operator | None] = <factory>, contentless: bool = False, name: str | None = None)[source]

Bases: Generic[H]

Assets manager.

In particular, manage autosave, autoload, and memory caching.

path: Path
access: AccessConfigs
header_type: Type[Header]
cache: Dict[H, Operator | None]
contentless: bool = False
name: str | None = None

Only for logging purposes.

lookup(stem: str, header: bool = False) Path[source]

Look up the content path in the inventory.

sync()[source]

Sync the headers in the cache with the content on disk.

In particular, headers on disk that are missing in the cache are added to it, without loading actual operators in memory.

Despite the name, the operation is non-destructive: even if the cache has been abused, nothing will be deleted nor unloaded.

empty()[source]

Empty the in-memory cache.
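
A minimal usage sketch (the paths and access configurations are hypothetical; inventories are normally managed by the EKO object itself):

from pathlib import Path

from eko.io.access import AccessConfigs
from eko.io.inventory import Inventory
from eko.io.items import Target

acc = AccessConfigs(path=Path("eko.tar"), readonly=False, open=True)
inv = Inventory(path=Path("operators"), access=acc, header_type=Target, name="operators")
inv.sync()   # register headers already on disk, without loading operators
inv.empty()  # drop any operator content cached in memory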

eko.io.items module

Inventory items definition.

class eko.io.items.Header[source]

Bases: object

Item header, containing metadata.

class eko.io.items.Evolution(origin: float, target: float, nf: int, cliff: bool = False)[source]

Bases: Header

Information to compute an evolution operator.

It describes the evolution with a fixed number of light flavors between two scales.

origin: float

Starting point.

target: float

Final point.

nf: int

Number of active flavors.

cliff: bool = False

Whether the operator is reaching a matching scale.

Cliff operators are the only ones allowed to be intermediate, even though they can also be final segments of an evolution path (see eko.matchings.Atlas.path()).

Intermediate ones always have final scales mu2 corresponding to matching scales, and initial scales mu20 corresponding to either matching scales or the global initial scale of the EKO.

Note

The name of cliff operators stems from the following diagram:

nf = 3 --------------------------------------------------------
                |
nf = 4 --------------------------------------------------------
                        |
nf = 5 --------------------------------------------------------
                                                    |
nf = 6 --------------------------------------------------------

where each lane corresponds to DGLAP evolution with the corresponding number of running flavors, and the vertical bridges are the perturbative matchings between two “adjacent” schemes.

classmethod from_atlas(segment: Segment, cliff: bool = False)[source]

Create instance from analogous eko.matchings.Atlas object.

property as_atlas: Segment

The associated segment.
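
For instance, a cliff operator evolving in the four-flavor scheme from the charm matching scale up to the bottom one might be labeled as follows (scales given as \(\mu^2\) values, masses hypothetical):

from eko.io.items import Evolution

ev = Evolution(origin=1.51**2, target=4.92**2, nf=4, cliff=True)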

class eko.io.items.Matching(scale: float, hq: int, inverse: bool)[source]

Bases: Header

Information to compute a matching operator.

Describe the matching between two different flavor number schemes.

scale: float
hq: int
inverse: bool
classmethod from_atlas(matching: Matching)[source]

Create instance from analogous eko.matchings.Atlas object.

property as_atlas: Matching

The associated matching.

class eko.io.items.Target(scale: float, nf: int)[source]

Bases: Header

Target evolution point, labeling the evolution from the origin up to this point.

scale: float
nf: int
classmethod from_ep(ep: Tuple[float, int])[source]

Create instance from the EPoint analogue.

property ep: Tuple[float, int]

Cast to EPoint.

class eko.io.items.Operator(operator: ndarray[Any, dtype[_ScalarType_co]], error: ndarray[Any, dtype[_ScalarType_co]] | None = None)[source]

Bases: object

Operator representation.

To be used to hold the result of a computed 4-dim operator (either a raw evolution operator or a matching condition).

Note

IO works with in-memory streams, in order to avoid intermediate writes on disk (only the tar file is read from and written to).

operator: ndarray[Any, dtype[_ScalarType_co]]

Content of the evolution operator.

error: ndarray[Any, dtype[_ScalarType_co]] | None = None

Errors on individual operator elements (mainly used for integration error, but it can host any kind of error).

save(stream: BinaryIO) bool[source]

Save content of operator to bytes.

The content is saved on a stream, in order to be able to perform the operation both on disk and in memory.

The returned value tells whether the saved operator contained the error (this also controls the format: npz with errors, npy otherwise).

classmethod load(stream: BinaryIO)[source]

Load operator from bytes.

An input stream is used to load the operator from, in order to support the operation both on disk and in memory.
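
A round trip through an in-memory stream, as a minimal sketch:

import io

import numpy as np

from eko.io.items import Operator

op = Operator(operator=np.zeros((2, 3, 2, 3)))
stream = io.BytesIO()
with_error = op.save(stream)  # False: no error array, hence the npy format
stream.seek(0)
restored = Operator.load(stream)
assert restored.operator.shape == (2, 3, 2, 3)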

class eko.io.items.Item(header: Header, content: Operator)[source]

Bases: object

Inventory item.

header: Header
content: Operator

eko.io.legacy module

Support legacy storage formats.

eko.io.legacy.load_tar(source: PathLike, dest: PathLike, errors: bool = False)[source]

Load tar representation from file.

Compliant with dump_tar() output.

Parameters:
  • source – source tar name

  • dest – dest tar name

  • errors – whether to also load errors (default False)

class eko.io.legacy.PseudoTheory(heavy: HeavyInfo)[source]

Bases: DictLike

Fake theory, mocking eko.io.runcards.TheoryCard.

Used to provide a theory for the EKO builder, even when the theory information is not available.

heavy: HeavyInfo
classmethod from_old(old: Dict[str, Any])[source]

Load from old metadata.

class eko.io.legacy.PseudoOperator(mu20: float, evolgrid: List[Tuple[float, int]], xgrid: XGrid, configs: dict)[source]

Bases: DictLike

Fake operator, mocking eko.io.runcards.OperatorCard.

Used to provide an operator card for the EKO builder, even when the operator information is not fully available.

mu20: float
evolgrid: List[Tuple[float, int]]
xgrid: XGrid
configs: dict
classmethod from_old(old: Dict[str, Any])[source]

Load from old metadata.

eko.io.legacy.ARRAY_SUFFIX = '.npy.lz4'

Suffix for array files inside the tarball.

eko.io.legacy.load_arrays(dir: Path) dict[source]

Load arrays from compressed dumps.

eko.io.legacy.OPERATOR = 'operator'

File name stem for operators.

eko.io.legacy.ERROR = 'operator_error'

File name stem for operator errors.

eko.io.legacy.op5to4(evolgrid: List[Tuple[float, int]], arrays: dict) Dict[Tuple[float, int], Operator][source]

Load dictionary of 4-dim operators, from a single 5-dim one.
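
The conversion is then a single call, sketched here with hypothetical file names:

from eko.io.legacy import load_tar

load_tar("eko-old.tar", "eko-new.tar", errors=True)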

eko.io.manipulate module

Manipulate output generated by EKO.

eko.io.manipulate.SIMGRID_ROTATION = 'ij,ajbk,kl->aibl'

Simultaneous grid rotation contraction indices.

eko.io.manipulate.rotation(new: XGrid | ndarray[Any, dtype[_ScalarType_co]] | None, old: XGrid | ndarray[Any, dtype[_ScalarType_co]], check: Callable, compute: Callable)[source]

Define grid rotation.

This function returns the new grid to be assigned and the rotation computed, if the checks for a non-trivial new grid are passed.

The check and the computation are delegated to the callables check and compute, respectively.

eko.io.manipulate.xgrid_check(new: XGrid | None, old: XGrid)[source]

Check validity of new xgrid.

eko.io.manipulate.xgrid_compute_rotation(new: XGrid, old: XGrid, interpdeg: int, swap: bool = False)[source]

Compute rotation from old to new xgrid.

By default, the rotation is computed for a target xgrid. If the function is to be used for an input xgrid, the swap argument should be set to True, in order to compute the rotation in the other direction (i.e. the transpose).

eko.io.manipulate.xgrid_reshape(eko: EKO, targetgrid: XGrid | None = None, inputgrid: XGrid | None = None)[source]

Reinterpolate operators on output and/or input grids.

Target corresponds to the output PDF.

The operation is in-place.
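
A sketch of reinterpolation onto a coarser target grid (the file name and grid are hypothetical, and XGrid is assumed importable from eko.interpolation):

from eko.interpolation import XGrid
from eko.io.manipulate import xgrid_reshape
from eko.io.struct import EKO

new = XGrid([1e-4, 1e-3, 1e-2, 1e-1, 1.0])
with EKO.edit("eko.tar") as evolution_operator:
    xgrid_reshape(evolution_operator, targetgrid=new)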

eko.io.manipulate.SIMPIDS_ROTATION = 'ca,ajbk,bd->cjdk'

Simultaneous flavor rotation contraction indices.

eko.io.manipulate.flavor_reshape(eko: EKO, targetpids: ndarray[Any, dtype[_ScalarType_co]] | None = None, inputpids: ndarray[Any, dtype[_ScalarType_co]] | None = None, update: bool = True)[source]

Change the operators to have targetpids in the output and/or inputpids in the input.

The operation is in-place.

Parameters:
  • eko – the operator to be rotated

  • targetpids – target rotation specified in the flavor basis

  • inputpids – input rotation specified in the flavor basis

  • update – update EKO metadata after writing

eko.io.manipulate.to_evol(eko: EKO, source: bool = True, target: bool = False)[source]

Rotate the operator into evolution basis.

This also assigns the pids. The operation is in-place.

Parameters:
  • eko – the operator to be rotated

  • source – rotate on the input tensor

  • target – rotate on the output tensor

eko.io.manipulate.to_uni_evol(eko: EKO, source: bool = True, target: bool = False)[source]

Rotate the operator into evolution basis.

This also assigns the pids. The operation is in-place.

Parameters:
  • eko – the operator to be rotated

  • source – rotate on the input tensor

  • target – rotate on the output tensor
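
For example, to rotate both sides into the evolution basis (a sketch, under the same assumptions as above):

from eko.io.manipulate import to_evol
from eko.io.struct import EKO

with EKO.edit("eko.tar") as evolution_operator:
    to_evol(evolution_operator, source=True, target=True)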

eko.io.metadata module

Define eko.EKO metadata.

class eko.io.metadata.Metadata(origin: Tuple[float, int], bases: Bases, _path: Path | None = None, version: str = '0.0.0', data_version: int = 1)[source]

Bases: DictLike

Manage metadata, and keep them synced on disk.

It is possible to have a metadata view, in which the path is not actually connected (i.e. it is set to None). In this case, no update is possible.

Note

Unfortunately, for nested structures it is not possible to detect a change in their attributes, so a call to update() has to be performed manually.

origin: Tuple[float, int]

Initial scale.

bases: Bases

Manipulation information, describing the current status of the EKO (e.g. inputgrid and targetgrid).

version: str = '0.0.0'

Library version used to create the corresponding file.

data_version: int = 1

Specs version, to which the file adheres.

classmethod load(path: PathLike)[source]

Load metadata from open folder.

Parameters:

path (os.PathLike) – the path to the open EKO folder

Returns:

loaded metadata

Return type:

Metadata

update()[source]

Update the disk copy of metadata.

property path

Access temporary dir path.

Raises:

RuntimeError – if path has not been initialized before

property raw

Override default DictLike.raw() representation to exclude path.
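
A loading sketch, given an already opened (extracted) EKO folder at a hypothetical path:

from eko.io.metadata import Metadata

meta = Metadata.load("path/to/open/eko")
print(meta.origin, meta.version)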

eko.io.paths module

Define paths inside an eko.EKO object.

class eko.io.paths.InternalPaths(root: Path)[source]

Bases: object

Paths inside an EKO folder.

This structure exists to locate in a single place the internal structure of an EKO folder.

The only value required is the root path; everything else is computed relative to this root. In case only the relative paths are required, just create this structure with root equal to the empty string or ".".

root: Path

The root of the EKO folder (use placeholder if not relevant)

property metadata

Metadata file.

property recipes

Recipes folder.

property recipes_matching

Matching recipes folder.

property parts

Parts folder.

property parts_matching

Matching parts folder.

property operators

Operators folder.

This is the one containing the actual EKO components, after computation has been performed.

property theory_card

Theory card dump.

property operator_card

Operator card dump.

bootstrap(theory: dict, operator: dict, metadata: dict)[source]

Create directory structure.
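
The relative layout can be inspected without any actual folder, as a sketch:

from pathlib import Path

from eko.io.paths import InternalPaths

paths = InternalPaths(Path("."))
print(paths.metadata, paths.operators)  # locations relative to the chosen root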

eko.io.raw module

Utilities to manipulate unstructured IO.

The content is treated independently of the particular data content, as generic unknown data in an abstract file format, e.g. a tar archive or YAML data file, as opposed to structured YAML representing a specific runcard.

eko.io.raw.is_within_directory(directory: PathLike, target: PathLike) bool[source]

Check if target path is contained in directory.

Thanks to TrellixVulnTeam for the idea.

Parameters:
  • directory – the directory where the target is supposed to be contained

  • target – the target file to check

eko.io.raw.safe_extractall(tar: TarFile, path: PathLike | None = None, members: Sequence[TarInfo] | None = None, *, numeric_owner: bool = False)[source]

Extract a tar archive avoiding CVE-2007-4559 issue.

Thanks to TrellixVulnTeam for the contribution.

All undocumented parameters have the same meaning as the analogous ones in TarFile.extractall().

Parameters:
  • tar – the tar archive object to be extracted

  • path – the path to extract to, if not specified the current directory is used
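
Typical usage mirrors TarFile.extractall(), as a minimal sketch with hypothetical names:

import tarfile

from eko.io.raw import safe_extractall

with tarfile.open("eko.tar") as tar:
    safe_extractall(tar, "destination-folder")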

eko.io.runcards module

Structures to hold runcards information.

All energy scales in the runcards should be saved linearly, not as squared values, for consistency. Squares are consistently taken inside.

class eko.io.runcards.TheoryCard(order: Tuple[int, int], couplings: CouplingsInfo, heavy: HeavyInfo, xif: float, n3lo_ad_variation: Tuple[int, int, int, int], use_fhmruvv: bool | None = False, matching_order: Tuple[int, int] | None = None)[source]

Bases: DictLike

Represent theory card content.

order: Tuple[int, int]

Perturbative order tuple, (QCD, QED).

couplings: CouplingsInfo

Couplings configuration.

heavy: HeavyInfo

Heavy quarks related information.

xif: float

Ratio between factorization scale and process scale.

n3lo_ad_variation: Tuple[int, int, int, int]

N3LO anomalous dimension variation, (gg, gq, qg, qq, nsp, nsm, nsv).

use_fhmruvv: bool | None = False

If True, use the FHMRUVV N3LO anomalous dimensions.

matching_order: Tuple[int, int] | None = None

Matching conditions perturbative order tuple, (QCD, QED).

class eko.io.runcards.Debug(skip_singlet: bool = False, skip_non_singlet: bool = False)[source]

Bases: DictLike

Debug configurations.

skip_singlet: bool = False

Whether to skip QCD singlet computation.

skip_non_singlet: bool = False

Whether to skip QCD non-singlet computation.

class eko.io.runcards.Configs(evolution_method: EvolutionMethod, ev_op_max_order: Tuple[int, int], ev_op_iterations: int, scvar_method: ScaleVariationsMethod | None, inversion_method: InversionMethod | None, interpolation_polynomial_degree: int, interpolation_is_log: bool, polarized: bool, time_like: bool, n_integration_cores: int = 1)[source]

Bases: DictLike

Solution specific configurations.

evolution_method: EvolutionMethod

Evolution mode.

ev_op_max_order: Tuple[int, int]

Maximum order to use in U matrices expansion.

Used only in perturbative solutions.

ev_op_iterations: int

Number of intervals in which to break the global path.

scvar_method: ScaleVariationsMethod | None

Scale variation method.

inversion_method: InversionMethod | None

Which method to use for backward matching conditions.

interpolation_polynomial_degree: int

Degree of the elements of the interpolation polynomial basis.

interpolation_is_log: bool

Whether to use polynomials in \(\log(x)\).

If false, polynomials are in \(x\).

polarized: bool

If true do polarized evolution.

time_like: bool

If true do time-like evolution.

n_integration_cores: int = 1

Number of cores used to parallelize integration.

class eko.io.runcards.OperatorCard(mu0: float, mugrid: List[Tuple[float, int]], xgrid: XGrid, configs: Configs, debug: Debug, eko_version: str = '0.0.0')[source]

Bases: DictLike

Operator Card info.

mu0: float

Initial scale.

mugrid: List[Tuple[float, int]]
xgrid: XGrid

Momentum fraction internal grid.

configs: Configs

Specific configuration to be used during the calculation of these operators.

debug: Debug

Debug configurations.

eko_version: str = '0.0.0'

Version of EKO package first used for generation.

property mu20

Squared value of initial scale.

property mu2grid: ndarray[Any, dtype[_ScalarType_co]]

Grid of squared final scales.

property evolgrid: List[Tuple[float, int]]

Grid of final evolution points.

property pids

Internal flavor basis, used for computation.

class eko.io.runcards.Legacy(theory: Dict[str, Any], operator: Dict[str, Any])[source]

Bases: object

Upgrade legacy runcards.

theory: Dict[str, Any]
operator: Dict[str, Any]
MOD_EV2METHOD = {'EXA': 'iterate-exact', 'EXP': 'iterate-expanded', 'TRN': 'truncated'}
static heavies(pattern: str, old_th: dict)[source]

Retrieve a set of values for all heavy flavors.

static fallback(*args: T, default: T | None = None) T[source]

Return the first non-None argument.

property new_theory

Build new format theory runcard.

property new_operator

Build new format operator runcard.

eko.io.runcards.update(theory: Dict[str, Any] | TheoryCard, operator: Dict[str, Any] | OperatorCard)[source]

Update legacy runcards.

This function is mainly defined for compatibility with the old interface. Prefer direct usage of Legacy in new code.

Consecutive applications of this function yield identical results:

cards = update(theory, operator)
assert update(*cards) == cards
eko.io.runcards.default_atlas(masses: list, matching_ratios: list)[source]

Create default landscape.

This method should not be used to write new runcards, but rather to have a consistent default for comparison with other software and existing PDF sets. There is no one-to-one relation between the number of running flavors and the final scales, unless all matchings are applied. But this is a custom choice, since it is possible to have PDFs in different FNS at the same scales.

eko.io.runcards.flavored_mugrid(mugrid: list, masses: list, matching_ratios: list)[source]

Upgrade the \(\mu^2\) grid to also contain the target number of flavors.

It determines the number of flavors of the PDF set at each target scale, inferring it from the specified scales.

This method should not be used to write new runcards, but rather to have a consistent default for comparison with other software and existing PDF sets. There is no one-to-one relation between the number of running flavors and the final scales, unless all matchings are applied. But this is a custom choice, since it is possible to have PDFs in different FNS at the same scales.
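
A sketch with hypothetical masses and unit matching ratios:

from eko.io.runcards import flavored_mugrid

evolgrid = flavored_mugrid(
    mugrid=[2.0, 10.0, 100.0],
    masses=[1.51, 4.92, 172.5],
    matching_ratios=[1.0, 1.0, 1.0],
)
# each scale is paired with the inferred number of flavors, e.g. (10.0, 5)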

eko.io.runcards.masses(theory: TheoryCard, evmeth: EvolutionMethod) List[float][source]

Compute masses in the chosen scheme.

eko.io.struct module

Define output representation structures.

eko.io.struct.inventories(path: Path, access: AccessConfigs) dict[source]

Set up empty inventories for object initialization.

class eko.io.struct.EKO(recipes: Inventory[Evolution], recipes_matching: Inventory[Matching], parts: Inventory[Evolution], parts_matching: Inventory[Matching], operators: Inventory[Target], metadata: Metadata, access: AccessConfigs)[source]

Bases: object

Operator interface.

This class offers an interface to an abstract operator, between memory and disk.

An actual operator might be arbitrarily huge, and in particular size limitations in memory are far stricter than on disk. Since manually managing the burden of off-loading parts of the operator might be hard for every application, and occasionally impossible (without a clear picture of the internals), the library itself offers this facility.

In particular, the data format on disk has a complete specification, and can hold a full operator independently of the loading procedure. The remaining task of partial loading is then carried out by this class (for the Python library; other implementations are possible and encouraged).

For this reason, a core component of an EKO object is a path, referring to the location on disk of the corresponding operator. Any EKO has an associated path:

  • for the computed object, it corresponds to the path where the actual result of the computation is already saved

  • for a new object, it is the path at which any result of final or intermediate computation is stored, as soon as it is produced

The computation can be stopped at any time, without the loss of any of the intermediate results.

recipes: Inventory[Evolution]
recipes_matching: Inventory[Matching]
parts: Inventory[Evolution]
parts_matching: Inventory[Matching]
operators: Inventory[Target]
metadata: Metadata

Operator metadata.

access: AccessConfigs

Access related configurations.

property paths: InternalPaths

Accessor for internal paths.

property bases: Bases

Bases information.

property xgrid: XGrid

Momentum fraction internal grid.

property mu20: float

Provide squared initial scale.

property mu2grid: List[float]

Provide the list of \(Q^2\) as an array.

property evolgrid: List[Tuple[float, int]]

Provide the list of evolution points as an array.

property theory_card

Provide theory card, retrieving from the dump.

property operator_card

Provide operator card, retrieving from the dump.

update()[source]

Write updates to the structure for persistence.

assert_permissions(read=True, write=False)[source]

Assert permissions on current operator.

property permissions

Provide permissions information.

load_recipes(recipes: List[Evolution | Matching])[source]

Load recipes in bulk.

operator(ep: Tuple[float, int])[source]

Retrieve an operator and discard it afterwards.

To be used as a context manager: the operator is automatically loaded as usual, but when the context manager closes it is dropped from memory.

items()[source]

Iterate operators, with minimal load.

Pay attention, this iterator:

  • is not a read-only operation from the point of view of the in-memory object (since after the iteration no operator is left loaded)

  • but it is a read-only operation from the point of view of the permanent object on-disk

Yields:

tuple – pairs of (q2, operator), loaded immediately before and unloaded immediately after
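
A sketch of the typical read-only pass over all the stored operators (the file name is hypothetical):

from eko.io.struct import EKO

with EKO.read("eko.tar") as evolution_operator:
    for ep, op in evolution_operator.items():
        print(ep, op.operator.shape)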

approx(ep: Tuple[float, int], rtol: float = 1e-06, atol: float = 1e-10) Tuple[float, int] | None[source]

Look for close enough evolution point in the EKO.

The distance is mostly evaluated along the \(\mu^2\) dimension, while \(n_f\) is considered with a discrete distance: if two points do not have the same \(n_f\), they are classified as far.

Raises:

ValueError – if multiple values are found in the neighbourhood

unload()[source]

Fully unload the operators in memory.

deepcopy(path: PathLike)[source]

Create a deep copy of current instance.

The managed on-disk object is copied as well, to the new path location. If you don’t want to copy the disk, consider using directly:

copy.deepcopy(myeko)

It will perform the exact same operation, without propagating it to the disk counterpart.

Parameters:

path – path to the permanent location of the new object (not the temporary directory)

Returns:

the copy created

Return type:

EKO

static load(tarpath: PathLike, dest: PathLike)[source]

Load the content of archive in a target directory.

Parameters:
  • tarpath – the archive to load from

  • dest – the target directory

classmethod open(path: PathLike, mode='r')[source]

Open EKO object in the specified mode.

classmethod read(path: PathLike)[source]

Read the content of an EKO.

Type-safe alias for:

EKO.open(... , "r")
classmethod create(path: PathLike)[source]

Create a new EKO.

Type-safe alias for:

EKO.open(... , "w")
classmethod edit(path: PathLike)[source]

Read from and write on existing EKO.

Type-safe alias for:

EKO.open(... , "a")
dump(archive: PathLike | None = None)[source]

Dump the current content to archive.

Parameters:

archive (os.PathLike or None) – path to the archive; in general you should keep the default, which will make use of the registered path (default: None)

Raises:

ValueError – when trying to dump on default archive in read-only mode

close()[source]

Close the current object, cleaning up.

If not in read-only mode, dump to permanent storage. Remove the temporary directory used.

property raw: dict

Provide raw representation of the full content.

Returns:

nested dictionary, storing all the values in the structure, except the operators themselves

Return type:

dict

class eko.io.struct.Builder(path: Path, access: AccessConfigs, theory: TheoryCard | None = None, operator: OperatorCard | None = None, eko: EKO | None = None)[source]

Bases: object

Build EKO instances.

path: Path

Path on disk to the EKO.

access: AccessConfigs

Access related configurations.

theory: TheoryCard | None = None
operator: OperatorCard | None = None
eko: EKO | None = None
load_cards(theory: TheoryCard, operator: OperatorCard)[source]

Load both theory and operator card.

build() EKO[source]

Build EKO instance.

Returns:

the constructed instance

Return type:

EKO

Raises:

RuntimeError – if not enough information is available (at least one card missing)
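
In practice, a builder is usually obtained by opening a new EKO through EKO.create(); a sketch, assuming theory_card and operator_card are valid TheoryCard and OperatorCard instances constructed beforehand:

from eko.io.struct import EKO

# theory_card and operator_card: previously constructed runcards
with EKO.create("new-eko.tar") as builder:
    builder.load_cards(theory_card, operator_card)
    new_eko = builder.build()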

eko.io.types module

Common type definitions, only used for static analysis.

class eko.io.types.ReferenceRunning(iterable=(), /)[source]

Bases: list, Generic[T]

Running quantities reference point.

To simplify serialization, the class is just storing the content as a list, but:

  • it is constructed with a Running.typed(T, Scale) signature

  • it should always be used through the property accessors, rather than using the list itself

classmethod typed(value: T, scale: float)[source]

Define constructor from individual values.

This is the preferred constructor for references, since it respects the intended types of the values. It is not the default one only to simplify (de)serialization.

property value: T

Reference value, given at a specified scale.

property scale: float

Reference scale, at which the value of the function is given.
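
For instance, a running quantity given at its reference scale (values hypothetical):

from eko.io.types import ReferenceRunning

alphas = ReferenceRunning.typed(0.118, 91.2)
assert alphas.value == 0.118
assert alphas.scale == 91.2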

eko.io.types.FlavNumRef

alias of ReferenceRunning[int]

eko.io.types.LinearScaleRef

alias of ReferenceRunning[float]

class eko.io.types.EvolutionMethod(value)[source]

Bases: Enum

DGLAP solution method.

ITERATE_EXACT = 'iterate-exact'
ITERATE_EXPANDED = 'iterate-expanded'
PERTURBATIVE_EXACT = 'perturbative-exact'
PERTURBATIVE_EXPANDED = 'perturbative-expanded'
TRUNCATED = 'truncated'
ORDERED_TRUNCATED = 'ordered-truncated'
DECOMPOSE_EXACT = 'decompose-exact'
DECOMPOSE_EXPANDED = 'decompose-expanded'
class eko.io.types.ScaleVariationsMethod(value)[source]

Bases: Enum

Method used to account for factorization scale variation.

EXPONENTIATED = 'exponentiated'
EXPANDED = 'expanded'
class eko.io.types.InversionMethod(value)[source]

Bases: Enum

Method used to invert the perturbative matching conditions.

EXACT = 'exact'
EXPANDED = 'expanded'