cape.cfdx.dataBook: CFD Data book module

This module contains functions for reading and processing forces, moments, and other entities from cases in a trajectory. It forms the core for all database post-processing in Cape, although several other database modules exist for more specific applications.

This module provides three basic classes upon which more specific data classes are developed:

  • DataBook: Overall databook container

  • DBBase: Template databook for an individual component

  • CaseData: Template class for one case’s iterative history

The first two of these are subclassed from dict, so that generic data can be accessed with syntax such as DB[coeff] for an appropriately named coefficient. An outline of derived classes for these three templates is shown below.

  • DataBook
    • DBTriqFM: post-processed forces & moments

  • DBBase
    • DBComp: force & moment data, one comp

    • DBTarget: target data

    • DBTriqFMComp: surface CP FM for one comp

    • DBLineLoad: sectional load databook

    • DBPointSensorGroup: group of points

    • DBTriqPointGroup: group of surface points

    • DBPointSensor: one point sensor

    • DBTriqPoint: one surface point sensor

  • CaseData
    • CaseFM: iterative force & moment history

    • CaseResid: iterative residual history

In addition, each solver has its own version of this module.

The parent class cape.cfdx.dataBook.DataBook provides a common interface to all of the requested force, moment, point sensor, etc. quantities that have been saved in the data book. Informing cape which quantities to track, and how to statistically process them, is done using the "DataBook" section of the JSON file, and the various data book options are handled within the API using the cape.cfdx.options.DataBook module.

The master data book class cape.cfdx.dataBook.DataBook is based on the built-in dict class with keys pointing to force and moment data books for individual components. For example, if the JSON file tells Cape to track the forces and/or moments on a component called "body", and the data book is the variable DB, then the force and moment data book is DB["body"]. This force and moment data book contains statistically averaged forces and moments and other statistical quantities for every case in the run matrix. The class of the force and moment data book is cape.cfdx.dataBook.DBComp.
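For example, assuming the JSON file defines a force & moment component named "body" with a coefficient column "CA" (both names are hypothetical and depend on the user's "DataBook" settings), the data book can be indexed like a dictionary:

DBc = DB["body"]      # DBComp instance for the "body" component
CA = DBc["CA"]        # array of averaged axial-force values, one entry per case
mach = DBc["mach"]    # run matrix values ("mach" is an assumed key) stored alongside the statistics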

The data book also has the capability to store “target” data books so that the user can compare results of the current CFD solutions to previous results or experimental data. These are stored in DB["Targets"] and use the cape.cfdx.dataBook.DBTarget class. Other types of data books can also be created, such as the cape.cfdx.pointSensor.DBPointSensor class for tracking statistical properties at individual points in the solution field. Data books for tracking results of groups of cases are built off of the cape.cfdx.dataBook.DBBase class, which contains many common tools such as plotting.

The cape.cfdx.dataBook module also contains classes for processing results within individual case folders. This includes the cape.cfdx.dataBook.CaseFM class for reading iterative force/moment histories and the cape.cfdx.dataBook.CaseResid class for iterative histories of residuals.

Global data book container class

class cape.cfdx.dataBook.DataBook(cntl, RootDir=None, targ=None, **kw)

Interface to the data book for a given CFD run matrix

Call:
>>> DB = cape.cfdx.dataBook.DataBook(cntl, **kw)
Inputs:
cntl: Cntl

CAPE control class instance

RootDir: str

Root directory, defaults to os.getcwd()

targ: {None} | str

Option to read duplicate data book as a target named targ

Outputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the Cape data book class

DB.x: cape.runmatrix.RunMatrix

Run matrix of rows saved in the data book

DB[comp]: cape.cfdx.dataBook.DBComp

Component data book for component comp

DB.Components: list[str]

List of force/moment components

DB.Targets: dict

Dictionary of DBTarget target data books

Versions:
  • 2014-12-20 @ddalle: Started

  • 2015-01-10 @ddalle: Version 1.0

  • 2022-03-07 @ddalle: Version 1.1; allow .cntl
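A minimal sketch of constructing and inspecting a data book, assuming cntl is an already-initialized Cntl instance for the run matrix of interest:

from cape.cfdx.dataBook import DataBook

DB = DataBook(cntl)             # read the existing data book files, if any
print(DB.Components)            # list of tracked force/moment components
DBc = DB[DB.Components[0]]      # DBComp data book for the first component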

DeleteCaseProp(I, comp=None)

Delete list of cases from generic-property databook

Call:
>>> DB.DeleteCaseProp(I)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the pyCart data book class

I: list[int]

List of trajectory indices

comp: {None} | list | str

Component or list of components

Versions:
  • 2022-04-08 @ddalle: Version 1.0

DeleteCasePropComp(I, comp)

Delete list of cases from generic-property databook comp

Call:
>>> n = DB.DeleteCasePropComp(I, comp)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the pyCart data book class

I: list[int]

List of trajectory indices

comp: str

Name of component

Outputs:
n: int

Number of deleted entries

Versions:
  • 2022-04-08 @ddalle: Version 1.0

DeleteCases(I, comp=None)

Delete list of cases from data book

Call:
>>> DB.DeleteCases(I, comp=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the pyCart data book class

I: list[int]

List of trajectory indices

comp: {None} | list | str

Component or list of components

Versions:
  • 2015-03-13 @ddalle: Version 1.0

  • 2017-04-13 @ddalle: Split by component
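A brief sketch of removing a few run-matrix cases from one component (the component name is hypothetical); omitting comp, or passing a list, deletes the cases from several components:

# Delete run matrix cases 3, 4, and 7 from the "body" component only
DB.DeleteCases([3, 4, 7], comp="body")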

DeleteCasesComp(I, comp)

Delete list of cases from data book

Call:
>>> n = DB.DeleteCasesComp(I, comp)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the pyCart data book class

I: list[int]

List of trajectory indices

comp: str

Name of component

Outputs:
n: int

Number of deleted entries

Versions:
  • 2015-03-13 @ddalle: Version 1.0

  • 2017-04-13 @ddalle: Split by component

DeleteDBPyFunc(I, comp=None)

Delete list of cases from PyFunc databook

Call:
>>> DB.DeleteDBPyFunc(I)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the pyCart data book class

I: list[int]

List of trajectory indices

comp: {None} | list | str

Component or list of components

Versions:
  • 2022-04-12 @ddalle: Version 1.0

DeleteDBPyFuncComp(I, comp)

Delete list of cases from PyFunc databook comp

Call:
>>> n = DB.DeleteDBPyFuncComp(I, comp)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the pyCart data book class

I: list[int]

List of trajectory indices

comp: str

Name of component

Outputs:
n: int

Number of deleted entries

Versions:
  • 2022-04-12 @ddalle: Version 1.0

DeleteLineLoad(I, comp=None)

Delete list of cases from LineLoad component data books

Call:
>>> DB.DeleteLineLoad(I, comp=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the data book class

I: list[int]

List of trajectory indices

comp: {None} | str | list

Component wild card or list of component wild cards

Versions:
  • 2017-04-25 @ddalle: Version 1.0

DeleteLineLoadComp(comp, I=None)

Delete list of cases from a LineLoad component data book

Call:
>>> n = DB.DeleteLineLoadComp(comp, I=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the data book class

comp: str

Name of component

I: list[int]

List of trajectory indices

Outputs:
n: list

Number of deletions made

Versions:
  • 2017-04-25 @ddalle: Version 1.0

DeleteTriqFM(I, comp=None)

Delete list of cases from TriqFM component data books

Call:
>>> DB.DeleteTriqFM(I, comp=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the data book class

I: {None} | list[int]

List or array of run matrix indices

comp: {None} | str | list

Component wild card or list of component wild cards

Versions:
  • 2017-04-25 @ddalle: Version 1.0

DeleteTriqFMComp(comp, I=None)

Delete list of cases from a TriqFM component data book

Call:
>>> n = DB.DeleteTriqFMComp(comp, I=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the data book class

comp: str

Name of component

I: {None} | list[int]

List or array of run matrix indices

Outputs:
n: list

Number of deletions made

Versions:
  • 2017-04-25 @ddalle: Version 1.0

DeleteTriqPoint(I, comp=None)

Delete list of cases from TriqPoint component data books

Call:
>>> DB.DeleteTriqPoint(I, comp=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the data book class

I: {None} | list[int]

List or array of run matrix indices

comp: {None} | str | list

Component wild card or list of component wild cards

Versions:
  • 2017-10-11 @ddalle: Version 1.0

DeleteTriqPointComp(comp, I=None)

Delete list of cases from a TriqPoint component data book

Call:
>>> n = DB.DeleteTriqPointComp(comp, I=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the data book class

comp: str

Name of component

I: {None} | list[int]

List or array of run matrix indices

Outputs:
n: list

Number of deletions made

Versions:
FindMatch(i)

Find an entry by run matrix (trajectory) variables

It is assumed that exact matches can be found.

Call:
>>> j = DB.FindMatch(i)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the Cape data book class

i: int

Index of the case from the trajectory to try match

Outputs:
j: numpy.ndarray[int]

Array of index(es) that match case i or NaN

Versions:
  • 2016-02-27 @ddalle: Added as a pointer to first component

FindTargetMatch(DBT, i, topts, keylist='tol', **kw)

Find a target entry by run matrix (trajectory) variables

Cases will be considered matches by comparing variables specified in the topts variable, which shares some of the options from the "Targets" subsection of the "DataBook" section of cape.json. Suppose that topts contains the following:

{
    "RunMatrix": {"alpha": "ALPHA", "Mach": "MACH"},
    "Tolerances": {
        "alpha": 0.05,
        "Mach": 0.01
    },
    "Keys": ["alpha", "Mach", "beta"]
}

Then any entry in the data book target that matches the Mach number within 0.01 (using a column labeled "MACH") and alpha to within 0.05 is considered a match. Because the Keys parameter contains "beta", the search will also look for exact matches in "beta".

If the Keys parameter is not set, the search will use either all the keys in the trajectory, x.cols, or just the keys specified in the "Tolerances" section of topts. Which of these two default lists to use is determined by the keylist input.

Call:
>>> j = DB.FindTargetMatch(DBT, i, topts, **kw)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the Cape data book class

DBT: DBBase | DBTarget

Target component databook

i: int

Index of the case from the trajectory to try match

topts: dict | DBTarget

Criteria used to determine a match

keylist: "x" | {"tol"}

Source for default list of keys

source: {"self"} | "target"

Match DB case i or DBT case i

Outputs:
j: numpy.ndarray[int]

Array of indices that match the trajectory

Versions:
  • 2016-02-27 @ddalle: Added as a pointer to first component

  • 2018-02-12 @ddalle: First input x -> DBT
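A sketch of a direct call, using match criteria equivalent to the JSON example above (key and column names are illustrative, and DBT is assumed to be an already-read target data book):

topts = {
    "RunMatrix": {"alpha": "ALPHA", "Mach": "MACH"},
    "Tolerances": {"alpha": 0.05, "Mach": 0.01},
    "Keys": ["alpha", "Mach", "beta"],
}
# Indices of DBT entries matching run matrix case *i*
j = DB.FindTargetMatch(DBT, i, topts, keylist="tol")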

GetCurrentIter()

Determine iteration number of current folder

Call:
>>> n = DB.GetCurrentIter()
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of data book class

Outputs:
n: int | None

Iteration number

Versions:
  • 2017-04-13 @ddalle: First separate version

GetDBMatch(j, ftarg, tol=0.0, tols={})

Get index of a target match (if any) for one data book entry

Call:
>>> i = DB.GetDBMatch(j, ftarg, tol=0.0, tols={})
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of a data book class

j: int | np.nan

Data book target index

ftarg: str

Name of the target and column

tol: float

Tolerance for matching all keys (0.0 enforces equality)

tols: dict

Dictionary of specific tolerances for each key

Outputs:
i: int

Data book index

Versions:
  • 2015-08-30 @ddalle: Version 1.0

GetRefComponent()

Get first component with type ‘FM’, ‘Force’, or ‘Moment’

Call:
>>> DBc = DB.GetRefComponent()
Inputs:
DB: cape.cfdx.dataBook.DataBook

Data book instance

Outputs:
DBc: cape.cfdx.dataBook.DBComp

Data book for one component

Versions:
  • 2016-08-18 @ddalle: Version 1.0

GetTargetByName(targ)

Get a target handle by name of the target

Call:
>>> DBT = DB.GetTargetByName(targ)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the data book class

targ: str

Name of target to find

Outputs:
DBT: cape.cfdx.dataBook.DBTarget

Instance of the pyCart data book target class

Versions:
  • 2015-06-04 @ddalle: Version 1.0
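For example, assuming a target named "Experiment" is defined in the "Targets" section of the JSON file:

DBT = DB.GetTargetByName("Experiment")   # DBTarget instance for that target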

GetTargetMatch(i, ftarg, tol=0.0, tols={})

Get index of a target match for one data book entry

Call:
>>> j = DB.GetTargetMatch(i, ftarg, tol=0.0, tols={})
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of data book class

i: int

Data book index

ftarg: str

Name of the target and column

tol: float

Tolerance for matching all keys

tols: dict

Dictionary of specific tolerances for each key

Outputs:
j: int | np.nan

Data book target index

Versions:
  • 2015-08-30 @ddalle: Version 1.0

GetTargetMatches(ftarg, tol=0.0, tols={})

Get vectors of indices matching targets

Call:
>>> I, J = DB.GetTargetMatches(ftarg, tol=0.0, tols={})
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of data book class

ftarg: str

Name of the target and column

tol: float

Tolerance for matching all keys

tols: dict

Dictionary of specific tolerances for each key

Outputs:
I: np.ndarray

Array of data book indices with matches

J: np.ndarray

Array of target indices for each data book index

Versions:
  • 2015-08-30 @ddalle: Version 1.0

MatchRunMatrix()

Restrict the data book object to points in the trajectory

Call:
>>> DB.MatchRunMatrix()
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the Cape data book class

Versions:
  • 2015-05-28 @ddalle: Version 1.0

PlotCoeff(comp, coeff, I, **kw)

Plot a sweep of one coefficient over several cases

Call:
>>> h = DB.PlotCoeff(comp, coeff, I, **kw)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the data book class

comp: str

Component whose coefficient is being plotted

coeff: str

Coefficient being plotted

I: np.ndarray[int]

List of indexes of cases to include in sweep

Keyword Arguments:
x: [ {None} | str ]

RunMatrix key for x axis (else plot against index)

Label: {comp} | str

Manually specified label

Legend: {True} | False

Whether or not to use a legend

StDev: {None} | float

Multiple of iterative history standard deviation to plot

MinMax: True | {False}

Option to plot min and max from iterative history

Uncertainty: True | {False}

Whether to plot direct uncertainty

LineOptions: dict

Plot options for the primary line(s)

StDevOptions: dict

Plot options for the standard deviation plot

MinMaxOptions: dict

Plot options for the min/max plot

UncertaintyOptions: dict

Dictionary of plot options for the uncertainty plot

FigWidth: float

Width of figure in inches

FigHeight: float

Height of figure in inches

PlotTypeStDev: {"FillBetween"} | "ErrorBar"

Plot function to use for standard deviation plot

PlotTypeMinMax: {"FillBetween"} | "ErrorBar"

Plot function to use for min/max plot

PlotTypeUncertainty: "FillBetween" | {"ErrorBar"}

Plot function to use for uncertainty plot

Outputs:
h: dict

Dictionary of plot handles

Versions:
  • 2015-05-30 @ddalle: Version 1.0

  • 2015-12-14 @ddalle: Added error bars
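A sketch of a typical sweep plot, with hypothetical component, coefficient, and run-matrix key names:

import numpy as np

I = np.array([0, 1, 2, 3])        # cases to include in the sweep
h = DB.PlotCoeff(
    "body", "CN", I,
    x="mach",                     # plot against Mach number
    StDev=3.0,                    # shade +/- 3 sigma from the iterative history
    MinMax=True,                  # also shade iterative min/max
    LineOptions={"color": "k", "marker": "o"})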

PlotContour(comp, coeff, I, **kw)

Create a contour plot of one coefficient over several cases

Call:
>>> h = DB.PlotContour(comp, coeff, I, **kw)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the data book class

comp: str

Component whose coefficient is being plotted

coeff: str

Coefficient being plotted

I: numpy.ndarray[int]

List of indexes of cases to include in sweep

Keyword Arguments:
x: str

RunMatrix key for x axis

y: str

RunMatrix key for y axis

ContourType: {“tricontourf”} | “tricontour” | “tripcolor”

Contour plotting function to use

LineType: {“plot”} | “triplot” | “none”

Line plotting function to highlight data points

Label: [ {comp} | str ]

Manually specified label

ColorBar: [ {True} | False ]

Whether or not to use a color bar

ContourOptions: dict

Plot options to pass to contour plotting function

LineOptions: dict

Plot options for the line plot

FigWidth: float

Width of figure in inches

FigHeight: float

Height of figure in inches

Outputs:
h: dict

Dictionary of plot handles

Versions:
  • 2015-05-30 @ddalle: Version 1.0

  • 2015-12-14 @ddalle: Added error bars

ProcessComps(comp=None, **kw)

Process list of components

This performs several conversions:

  • None: DB.Components

  • str: comp.split(',')

  • list: comp (used as is)

Call:
>>> DB.ProcessComps(comp=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the pyCart data book class

comp: {None} | list | str

Component or list of components

Versions:
  • 2017-04-13 @ddalle: Version 1.0

ReadCaseFM(comp)

Read a CaseFM object

Call:
>>> FM = DB.ReadCaseFM(comp)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of data book class

comp: str

Name of component

Outputs:
FM: cape.cfdx.dataBook.CaseFM

Iterative force & moment history class

Versions:
  • 2017-04-13 @ddalle: First separate version

ReadCaseProp(comp)

Read a CaseProp object

Call:
>>> prop = DB.ReadCaseProp(comp)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of data book class

comp: str

Name of component

Outputs:
prop: cape.cfdx.dataBook.CaseProp

Generic-property iterative history instance

Versions:
  • 2022-04-08 @ddalle: Version 1.0

ReadCaseResid()

Read a CaseResid object

Call:
>>> H = DB.ReadCaseResid()
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of data book class

Outputs:
H: cape.cfdx.dataBook.CaseResid

Residual history class

Versions:
  • 2017-04-13 @ddalle: First separate version

ReadDBCaseProp(comp, check=False, lock=False)

Initialize data book for one component

Call:
>>> DB.ReadDBCaseProp(comp, check=False, lock=False)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the pyCart data book class

comp: str

Name of component

check: True | {False}

Whether or not to check for LOCK file

lock: True | {False}

Whether or not to create LOCK file

Versions:
  • 2015-11-10 @ddalle: Version 1.0

  • 2017-04-13 @ddalle: Self-contained and renamed

ReadDBComp(comp, check=False, lock=False)

Initialize data book for one component

Call:
>>> DB.ReadDBComp(comp, check=False, lock=False)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the pyCart data book class

comp: str

Name of component

check: True | {False}

Whether or not to check for LOCK file

lock: True | {False}

Whether or not to create LOCK file

Versions:
  • 2015-11-10 @ddalle: Version 1.0

  • 2017-04-13 @ddalle: Self-contained and renamed

ReadDBPyFunc(comp, check=False, lock=False)

Initialize data book for one PyFunc component

Call:
>>> DB.ReadDBPyFunc(comp, check=False, lock=False)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the pyCart data book class

comp: str

Name of component

check: True | {False}

Whether or not to check for LOCK file

lock: True | {False}

Whether or not to create LOCK file

Versions:
  • 2022-04-10 @ddalle: Version 1.0

ReadLineLoad(comp, conf=None, targ=None)

Read a line load data book

Call:
>>> DB.ReadLineLoad(comp)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the pyCart data book class

comp: str

Line load component group

conf: {None} | cape.config.Config

Surface configuration interface

targ: {None} | str

Alternate directory to read from, else DB.targ

Versions:
  • 2015-09-16 @ddalle: Version 1.0

  • 2016-06-27 @ddalle: Added targ

ReadTarget(targ)

Read a data book target if it is not already present

Call:
>>> DB.ReadTarget(targ)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the Cape data book class

targ: str

Target name

Versions:
  • 2015-09-16 @ddalle: Version 1.0

ReadTriqFM(comp, check=False, lock=False)

Read a TriqFM data book if not already present

Call:
>>> DB.ReadTriqFM(comp, check=False, lock=False)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Data book instance

comp: str

Name of TriqFM component

check: True | {False}

Whether or not to check LOCK status

lock: True | {False}

If True, wait if the LOCK file exists

Versions:
  • 2017-03-28 @ddalle: Version 1.0

Sort(key=None, I=None)

Sort a data book according to either a key or an index

Call:
>>> DB.Sort()
>>> DB.Sort(key)
>>> DB.Sort(I=I)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the Cape data book class

key: str | list[str]

Name of trajectory key or list of keys on which to sort

I: np.ndarray[int]

List of indices; must have same size as data book

Versions:
  • 2014-12-30 @ddalle: Version 1.0

  • 2015-06-19 @ddalle: New multi-key sort

  • 2016-01-13 @ddalle: Checks to allow incomplete comps
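For example, to sort by Mach number and then angle of attack (assuming those are trajectory key names), or by an explicit index array:

DB.Sort(["mach", "alpha"])    # multi-key sort
# or, with I an index array the same size as the data book:
# DB.Sort(I=I)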

UpdateCaseComp(i, comp)

Update or add a case to a data book

The history of a run directory is processed if any one of three criteria is met.

  1. The case is not already in the data book

  2. The most recent iteration is greater than the data book value

  3. The number of iterations used to create statistics has changed

Call:
>>> n = DB.UpdateCaseComp(i, comp)
Inputs:
DB: pyFun.dataBook.DataBook

Instance of the data book class

i: int

RunMatrix index

comp: str

Name of component

Outputs:
n: 0 | 1

How many updates were made

Versions:
  • 2014-12-22 @ddalle: Version 1.0

  • 2017-04-12 @ddalle: Modified to work one component

  • 2017-04-23 @ddalle: Added output

UpdateCaseProp(I, comp=None)

Update a generic-property databook

Call:
>>> DB.UpdateCaseProp(I, comp=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of data book class

comp: {None} | str

Name of data book component (default is all)

I: list[int]

List of trajectory indices

Versions:
  • 2022-04-08 @ddalle: Version 1.0

UpdateCasePropCase(i, comp)

Update or add a case to a generic-property data book

The history of a run directory is processed if any one of three criteria is met.

  1. The case is not already in the data book

  2. The most recent iteration is greater than the data book value

  3. The number of iterations used to create statistics has changed

Call:
>>> n = DB.UpdateCasePropCase(i, comp)
Inputs:
DB: DataBook

Instance of the data book class

i: int

RunMatrix index

comp: str

Name of component

Outputs:
n: 0 | 1

How many updates were made

Versions:
  • 2022-04-08 @ddalle: Version 1.0

UpdateCasePropComp(comp, I=None)

Update a component of the generic-property data book

Call:
>>> DB.UpdateCasePropComp(comp, I=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of data book class

comp: str

Name of data book component

I: {None} | list[int]

List or array of run matrix indices

Versions:
  • 2022-04-08 @ddalle: Version 1.0

UpdateDBPyFunc(I, comp=None)

Update a scalar Python function output databook

Call:
>>> DB.UpdateDBPyFunc(I, comp=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of data book class

comp: {None} | str

Name of data book component (default is all)

I: list[int]

List of trajectory indices

Versions:
  • 2022-04-10 @ddalle: Version 1.0

UpdateDBPyFuncCase(i, comp)

Update or add a case to a PyFunc data book

The history of a run directory is processed if any one of three criteria is met.

  1. The case is not already in the data book

  2. The most recent iteration is greater than the data book value

  3. The number of iterations used to create statistics has changed

Call:
>>> n = DB.UpdateDBPyFuncCase(i, comp)
Inputs:
DB: DataBook

Instance of the data book class

i: int

RunMatrix index

comp: str

Name of component

Outputs:
n: 0 | 1

How many updates were made

Versions:
  • 2022-04-08 @ddalle: Version 1.0

UpdateDBPyFuncComp(comp, I=None)

Update a PyFunc component of the databook

Call:
>>> DB.UpdateDBPyFuncComp(comp, I=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of data book class

comp: str

Name of data book component

I: {None} | list[int]

List or array of run matrix indices

Versions:
  • 2022-04-10 @ddalle: Version 1.0

UpdateDataBook(I=None, comp=None)

Update the data book for a list of cases from the run matrix

Call:
>>> DB.UpdateDataBook(I=None, comp=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the data book class

I: list[int] | None

List of trajectory indices to update

comp: {None} | list | str

Component or list of components

Versions:
  • 2014-12-22 @ddalle: Version 1.0

  • 2017-04-12 @ddalle: Split by component
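A minimal update-and-write sketch; omitting I updates all cases, and omitting comp updates every component (the component name below is hypothetical):

DB.UpdateDataBook(I=list(range(20)), comp="body")   # process the first 20 cases
DB.Write()                                          # write the data book to file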

UpdateLineLoad(I, comp=None, conf=None)

Update a line load data book for a list of cases

Call:
>>> n = DB.UpdateLineLoad(I, comp=None, conf=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of data book class

I: list[int]

List of trajectory indices

comp: {None} | str

Line load DataBook component or wild card

Outputs:
n: int

Number of cases updated or added

Versions:
  • 2015-09-17 @ddalle: Version 1.0

  • 2016-12-20 @ddalle: Copied to cape

  • 2017-04-25 @ddalle: Added wild cards

UpdateLineLoadComp(comp, I=None, conf=None)

Update a line load data book for a list of cases

Call:
>>> n = DB.UpdateLineLoadComp(comp, conf=None, I=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of data book class

comp: str

Name of line load DataBook component

I: {None} | list[int]

List of trajectory indices

qpbs: True | {False}

Whether or not to submit as a script

Outputs:
n: int

Number of cases updated or added

Versions:
  • 2015-09-17 @ddalle: Version 1.0

  • 2016-12-20 @ddalle: Copied to cape

UpdateRunMatrix()

Match the trajectory to the cases in the data book

Call:
>>> DB.UpdateRunMatrix()
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the Cape data book class

Versions:
  • 2015-05-22 @ddalle: Version 1.0

UpdateTriqFM(I, comp=None)

Update a TriqFM triangulation-extracted F&M data book

Call:
>>> DB.UpdateTriqFM(I, comp=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of data book class

comp: {None} | str

Name of TriqFM data book component (default is all)

I: list[int]

List of trajectory indices

Versions:
  • 2017-03-29 @ddalle: Version 1.0

UpdateTriqFMComp(comp, I=None)

Update a TriqFM triangulation-extracted F&M data book

Call:
>>> DB.UpdateTriqFMComp(comp, I=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of data book class

comp: str

Name of TriqFM data book component

I: {None} | list[int]

List or array of run matrix indices

Versions:
  • 2017-03-29 @ddalle: Version 1.0

UpdateTriqPoint(I, comp=None)

Update a TriqPoint triangulation-extracted point sensor data book

Call:
>>> DB.UpdateTriqPoint(I, comp=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of data book class

I: list[int]

List or array of run matrix indices

comp: {None} | str

Name of TriqPoint group or all if None

Versions:
  • 2017-10-11 @ddalle: Version 1.0

UpdateTriqPointComp(comp, I=None)

Update a TriqPoint triangulation-extracted data book

Call:
>>> n = DB.UpdateTriqPointComp(comp, I=None)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of data book class

comp: {None} | str

Name of TriqPoint group or all if None

I: {None} | list[int]

List or array of run matrix indices

Outputs:
n: int

Number of updates made

Versions:
  • 2017-10-11 @ddalle: Version 1.0

Write(unlock=True)

Write the current data book in Python memory to file

Call:
>>> DB.Write(unlock=True)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the Cape data book class

Versions:
  • 2014-12-22 @ddalle: Version 1.0

  • 2015-06-19 @ddalle: New multi-key sort

  • 2017-06-12 @ddalle: Added unlock

mkdir(fdir)

Create a directory using settings from DataBook>umask

Call:
>>> DB.mkdir(fdir)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the Cape data book class

fdir: str

Directory to create

Versions:
  • 2017-09-05 @ddalle: Version 1.0

Individual data books

class cape.cfdx.dataBook.DBBase(comp, cntl, check=False, lock=False, **kw)

Individual item data book basis class

Call:
>>> DBi = DBBase(comp, cntl, check=False, lock=False)
Inputs:
comp: str

Name of the component or other item name

cntl: Cntl

CAPE control class instance

check: True | {False}

Whether or not to check LOCK status

lock: True | {False}

If True, wait if the LOCK file exists

Outputs:
DBi: cape.cfdx.dataBook.DBBase

An individual item data book

Versions:
  • 2014-12-22 @ddalle: Version 1.0

  • 2015-12-04 @ddalle: Forked from DBComp

ArgSort(key=None)

Return indices that would sort a data book by a trajectory key

Call:
>>> I = DBi.ArgSort(key=None)
Inputs:
DBi: cape.cfdx.dataBook.DBBase

An individual item data book

key: str

Name of trajectory key to use for sorting; default is first key

Outputs:
I: numpy.ndarray[int]

Array of indices that would sort the data book

Versions:
  • 2014-12-30 @ddalle: Version 1.0

CheckLock()

Check if lock file for this component exists

Call:
>>> q = DBc.CheckLock()
Inputs:
DBc: cape.cfdx.dataBook.DataBookBase

Data book base object

Outputs:
q: bool

Whether or not corresponding LOCK file exists

Versions:
  • 2017-06-12 @ddalle: Version 1.0

EstimateLineCount(fname=None)

Get a conservative (high) estimate of the number of lines in a file

Call:
>>> n, pos = DBP.EstimateLineCount(fname)
Inputs:
DBP: cape.cfdx.dataBook.DBBase

Data book base object

fname: str

Name of data file to read

Outputs:
n: int

Conservative estimate of length of file

pos: int

Position of first data character

Versions:
  • 2016-03-15 @ddalle: Version 1.0

FindCoSweep(x, i, EqCons=[], TolCons={}, GlobCons=[], xkeys={})

Find data book entries meeting constraints seeded from point i

Cases will be considered matches if data book values match trajectory x point i. For example, suppose EqCons and TolCons have the following values:

EqCons = ["beta"]
TolCons = {"alpha": 0.05, "mach": 0.01}

Then this method will compare DBc["mach"] to x.mach[i]. Any case that passes all of the following tests will be included.

abs(DBc["mach"] - x.mach[i]) <= 0.01
abs(DBc["alpha"] - x.alpha[i]) <= 0.05
DBc["beta"] == x.beta[i]

All entries must also meet a list of global constraints from GlobCons. Users can also use xkeys as a dictionary of alternate key names to compare to the trajectory. Consider the following values:

TolCons = {"alpha": 0.05}
xkeys = {"alpha": "AOA"}

Then the test becomes:

abs(DBc["AOA"] - x.alpha[i]) <= 0.05
Call:
>>> J = DBc.FindCoSweep(x, i, EqCons={}, TolCons={}, **kw)
Inputs:
DBc: cape.cfdx.dataBook.DBBase

Data book component instance

x: cape.runmatrix.RunMatrix

RunMatrix (i.e. run matrix) to use for target value

i: int

Index of the case from the trajectory to try match

EqCons: {[]} | list (str)

List of variables that must match the trajectory exactly

TolCons: {{}} | dict[float]

List of variables that may match trajectory within a tolerance

GlobCons: {[]} | list (str)

List of global constraints, see cape.RunMatrix.Filter()

xkeys: {{}} | dict (str)

Dictionary of alternative names of variables

Outputs:
J: numpy.ndarray[int]

Array of indices that match the trajectory within tolerances

Versions:
  • 2014-12-21 @ddalle: Version 1.0

  • 2016-06-27 @ddalle: Moved from DBTarget and generalized
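A sketch of the call corresponding to the constraints above, where DBc is a component data book and DB.x is the run matrix (the key names are illustrative):

J = DBc.FindCoSweep(
    DB.x, i,
    EqCons=["beta"],                         # exact match on beta
    TolCons={"alpha": 0.05, "mach": 0.01})   # tolerance match on alpha and Mach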

FindDBMatch(DBc, i)

Find the index of an exact match to case i in another databook

Call:
>>> j = DBi.FindDBMatch(DBc, i)
Inputs:
DBi: cape.cfdx.dataBook.DBBase

Data book base object

DBc: cape.cfdx.dataBook.DBBase

Another data book base object

i: int

Data book index for DBi

Outputs:
j: None | int

Data book index for DBj

Versions:
  • 2017-06-26 @ddalle: Version 1.0

FindMatch(i)

Find an entry by run matrix (trajectory) variables

It is assumed that exact matches can be found. However, trajectory keys that do not affect the name of the case folder are not considered.

Call:
>>> j = DBi.FindMatch(i)
Inputs:
DBi: cape.cfdx.dataBook.DBBase

An individual item data book

i: int

Index of the case from the trajectory to try match

Outputs:
j: numpy.ndarray[int]

Array of indices that match the trajectory case, or NaN if no match is found

Versions:
  • 2014-12-22 @ddalle: Version 1.0

FindTargetMatch(DBT, i, topts={}, keylist='tol', **kw)

Find a target entry by run matrix (trajectory) variables

Cases will be considered matches by comparing variables specified in the topts variable, which shares some of the options from the "Targets" subsection of the "DataBook" section of cape.json. Suppose that topts contains the following:

{
    "RunMatrix": {"alpha": "ALPHA", "Mach": "MACH"},
    "Tolerances": {
        "alpha": 0.05,
        "Mach": 0.01
    },
    "Keys": ["alpha", "Mach", "beta"]
}

Then any entry in the data book target that matches the Mach number within 0.01 (using a column labeled "MACH") and alpha to within 0.05 is considered a match. Because the Keys parameter contains "beta", the search will also look for exact matches in "beta".

If the Keys parameter is not set, the search will use either all the keys in the trajectory, x.cols, or just the keys specified in the "Tolerances" section of topts. Which of these two default lists to use is determined by the keylist input.

Call:
>>> j = DBc.FindTargetMatch(DBT, i, topts, keylist='x', **kw)
Inputs:
DBc: cape.cfdx.dataBook.DBBase | DBTarget

Instance of original databook

DBT: DBBase | DBTarget

Target databook of any type

i: int

Index of the case from either DBc.x or DBT.x to match

topts: dict | cape.cfdx.options.DataBook.DBTarget

Criteria used to determine a match

keylist: {"x"} | "tol"

Default test key source: x.cols or topts.Tolerances

source: "self" | {"target"}

Match DBc.x case i if "self", else DBT.x case i

Outputs:
j: numpy.ndarray[int]

Array of indices that match the trajectory within tolerances

Versions:
  • 2014-12-21 @ddalle: Version 1.0

  • 2016-06-27 @ddalle: Moved from DBTarget and generalized

  • 2018-02-12 @ddalle: Changed first input to DBBase

GetCoeff(comp, coeff, I, **kw)

Get a coefficient value for one or more cases

Call:
>>> v = DBT.GetCoeff(comp, coeff, i)
>>> V = DBT.GetCoeff(comp, coeff, I)
Inputs:
DBT: cape.cfdx.dataBook.DBTarget

Instance of the Cape data book target class

comp: str

Component whose coefficient is being plotted

coeff: str

Coefficient being plotted

i: int

Individual case/entry index

I: numpy.ndarray[int]

List of indexes of cases to include in sweep

Outputs:
v: float

Scalar value from the appropriate column

V: np.ndarray

Array of values from the appropriate column

Versions:
  • 2018-02-12 @ddalle: Version 1.0

GetDeltaStats(DBT, comp, coeff, I, topts={}, **kw)

Calculate statistics on differences between two databooks

Call:
>>> S = DBc.GetDeltaStats(DBT, comp, coeff, I, topts={}, **kw)
Inputs:
DBc: cape.cfdx.dataBook.DBBase

Component databook

coeff: str

Name of coefficient on which to compute statistics

I: list[int]

Indices of cases/entries to consider

topts: {{}} | dict

Dictionary of tolerances for variables in question

keylist: {"x"} | "tol"

Default test key source: x.cols or topts.Tolerances

CombineTarget: {True} | False

For cases with multiple matches, compare to mean target value

Outputs:
S: dict

Dictionary of statistical results

S["delta"]: np.ndarray

Array of deltas for each valid case

S["n"]: int

Number of valid cases

S["mu"]: float

Mean of histogram

Versions:
  • 2018-02-12 @ddalle: Version 1.0
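A sketch of computing databook-minus-target statistics for one coefficient, assuming DBT is a target data book and the component, coefficient, and tolerance key names are illustrative:

S = DBc.GetDeltaStats(
    DBT, "body", "CA", I,
    topts={"Tolerances": {"mach": 0.01, "alpha": 0.05}})
print(S["n"], S["mu"])    # number of valid cases and mean delta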

GetLockFile()

Get the name of the potential lock file

Call:
>>> flock = DBc.GetLockFile()
Inputs:
DBc: cape.cfdx.dataBook.DataBookBase

Data book base object

Outputs:
flock: str

Full path to potential lock file

Versions:
  • 2017-06-12 @ddalle: Version 1.0

GetRunMatrixIndex(j)

Find an entry in the run matrix (trajectory)

Call:
>>> i = DBi.GetRunMatrixIndex(j)
Inputs:
DBi: cape.cfdx.dataBook.DBBase

An individual item data book

j: int

Index of the case from the databook to try match

Outputs:
i: int

RunMatrix index or None

Versions:
  • 2015-05-28 @ddalle: Version 1.0

Lock()

Write a ‘LOCK’ file for a data book component

Call:
>>> DBc.Lock()
Inputs:
DBc: cape.cfdx.dataBook.DataBookBase

Data book base object

Versions:
  • 2017-06-12 @ddalle: Version 1.0

Merge(DBc)

Merge another copy of the data book object

Call:
>>> DBi.Merge(DBc)
Inputs:
DBi: cape.cfdx.dataBook.DBBase

Component data book

DBc: cape.cfdx.dataBook.DBBase

Copy of component data book, perhaps read at a different time

Versions:
  • 2017-06-26 @ddalle: Version 1.0

PlotCoeff(coeff, I, **kw)

Plot a sweep of one coefficient over several cases

Call:
>>> h = DBi.PlotCoeff(coeff, I, **kw)
Inputs:
DBi: cape.cfdx.dataBook.DBBase

An individual item data book

coeff: str

Coefficient being plotted

I: numpy.ndarray[int]

List of indexes of cases to include in sweep

Keyword Arguments:
Outputs:
h: dict

Dictionary of plot handles

Versions:
  • 2015-05-30 @ddalle: Version 1.0

  • 2015-12-14 @ddalle: Added error bars

PlotCoeffBase(coeff, I, **kw)

Plot a sweep of one coefficient or quantity over several cases

This is the base method upon which data book sweep plotting is built. Other methods may call this one with modifications to the default settings. For example cape.cfdx.dataBook.DBTarget.PlotCoeff() changes the default LineOptions to show a red line instead of the standard black line. All settings can still be overruled by explicit inputs to either this function or any of its children.

Call:
>>> h = DBi.PlotCoeffBase(coeff, I, **kw)
Inputs:
DBi: cape.cfdx.dataBook.DBBase

An individual item data book

coeff: str

Coefficient being plotted

I: numpy.ndarray[int]

List of indexes of cases to include in sweep

Keyword Arguments:
x: {None} | str

RunMatrix key for x axis (or plot against index if None)

Label: {comp} | str

Manually specified label

Legend: {True} | False

Whether or not to use a legend

StDev: {None} | float

Multiple of iterative history standard deviation to plot

MinMax: {False} | True

Whether to plot minimum and maximum over iterative history

Uncertainty: {False} | True

Whether to plot direct uncertainty

LineOptions: dict

Plot options for the primary line(s)

StDevOptions: dict

Dictionary of plot options for the standard deviation plot

MinMaxOptions: dict

Dictionary of plot options for the min/max plot

UncertaintyOptions: dict

Dictionary of plot options for the uncertainty plot

FigWidth: float

Width of figure in inches

FigHeight: float

Height of figure in inches

PlotTypeStDev: {'FillBetween'} | 'ErrorBar'

Plot function to use for standard deviation plot

PlotTypeMinMax: {'FillBetween'} | 'ErrorBar'

Plot function to use for min/max plot

PlotTypeUncertainty: 'FillBetween' | {'ErrorBar'}

Plot function to use for uncertainty plot

LegendFontSize: {9} | int > 0 | float

Font size for use in legends

Grid: {None} | True | False

Turn on/off major grid lines, or leave as is if None

GridStyle: {{}} | dict

Dictionary of major grid line line style options

MinorGrid: {None} | True | False

Turn on/off minor grid lines, or leave as is if None

MinorGridStyle: {{}} | dict

Dictionary of minor grid line line style options

Outputs:
h: dict

Dictionary of plot handles

Versions:
  • 2015-05-30 @ddalle: Version 1.0

  • 2015-12-14 @ddalle: Added error bars

PlotContour(coeff, I, **kw)

Create a contour plot for a subset of cases

Call:
>>> h = DBi.PlotContour(coeff, I, **kw)
Inputs:
DBi: cape.cfdx.dataBook.DBBase

An individual item data book

coeff: str

Coefficient being plotted

I: numpy.ndarray[int]

List of indexes of cases to include in sweep

Keyword Arguments:
Outputs:
h: dict

Dictionary of plot handles

Versions:
  • 2017-04-17 @ddalle: Version 1.0

PlotContourBase(coeff, I, **kw)

Create a contour plot of selected data points

Call:
>>> h = DBi.PlotContourBase(coeff, I, **kw)
Inputs:
DBi: cape.cfdx.dataBook.DBBase

An individual item data book

coeff: str

Coefficient being plotted

I: numpy.ndarray[int]

List of indexes of cases to include in sweep

Keyword Arguments:
x: str

RunMatrix key for x axis

y: str

RunMatrix key for y axis

ContourType: {“tricontourf”} | “tricontour” | “tripcolor”

Contour plotting function to use

LineType: {“plot”} | “triplot” | “none”

Line plotting function to highlight data points

Label: [ {comp} | str ]

Manually specified label

ColorMap: {"jet"} | str

Name of color map to use

ColorBar: [ {True} | False ]

Whether or not to use a color bar

ContourOptions: dict

Plot options to pass to contour plotting function

LineOptions: dict

Plot options for the line plot

FigWidth: float

Width of figure in inches

FigHeight: float

Height of figure in inches

Outputs:
h: dict

Dictionary of plot handles

Versions:
  • 2017-04-17 @ddalle: Version 1.0

PlotHist(coeff, I, **kw)

Plot a histogram over several cases

Call:
>>> h = DBi.PlotHist(coeff, I, **kw)
Inputs:
DBi: cape.cfdx.dataBook.DBBase

An individual item data book

coeff: str

Coefficient being plotted

I: numpy.ndarray[int]

List of indexes of cases to include in sweep

Keyword Arguments:
Outputs:
h: dict

Dictionary of plot handles

Versions:
  • 2016-04-04 @ddalle: Version 1.0

PlotHistBase(coeff, I, **kw)

Plot a histogram of one coefficient over several cases

Call:
>>> h = DBi.PlotHistBase(coeff, I, **kw)
Inputs:
DBi: cape.cfdx.dataBook.DBBase

An individual item data book

coeff: str

Coefficient being plotted

I: numpy.ndarray[int]

List of indexes of cases to include in sweep

Keyword Arguments:
FigWidth: float

Figure width

FigHeight: float

Figure height

Label: [ {comp} | str ]

Manually specified label

Target: {None} | DBBase | list

Target database or list thereof

TargetValue: float | list[float]

Target or list of target values

TargetLabel: str | list (str)

Legend label(s) for target(s)

StDev: [ {None} | float ]

Multiple of iterative history standard deviation to plot

HistOptions: dict

Plot options for the primary histogram

StDevOptions: dict

Dictionary of plot options for the standard deviation plot

DeltaOptions: dict

Options passed to plt.plot() for reference range plot

MeanOptions: dict

Options passed to plt.plot() for mean line

TargetOptions: dict

Options passed to plt.plot() for target value lines

OutlierSigma: {7.0} | float

Standard deviation multiplier for determining outliers

ShowMu: bool

Option to print value of mean

ShowSigma: bool

Option to print value of standard deviation

ShowError: bool

Option to print value of sampling error

ShowDelta: bool

Option to print reference value

ShowTarget: bool

Option to show target value

MuFormat: {"%.4f"} | str

Format for text label of the mean value

DeltaFormat: {"%.4f"} | str

Format for text label of the reference value

SigmaFormat: {"%.4f"} | str

Format for text label of the iterative standard deviation

TargetFormat: {"%.4f"} | str

Format for text label of the target value

XLabel: str

Specified label for x-axis, default is Iteration Number

YLabel: str

Specified label for y-axis; default is the coefficient name

Outputs:
h: dict

Dictionary of plot handles

Versions:
  • 2015-05-30 @ddalle: Version 1.0

  • 2015-12-14 @ddalle: Added error bars

  • 2016-04-04 @ddalle: Moved from point sensor to data book
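A sketch of a histogram annotated with mean and standard deviation (the coefficient name is hypothetical, and HistOptions is passed to the primary histogram plot):

h = DBi.PlotHistBase(
    "CA", I,
    ShowMu=True,                 # print the mean value on the plot
    ShowSigma=True,              # print the standard deviation
    HistOptions={"bins": 20})    # options for the primary histogram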

PlotRangeHist(coeff, I, **kw)

Plot a range histogram over several cases

Call:
>>> h = DBi.PlotRangeHist(coeff, I, **kw)
Inputs:
DBi: cape.cfdx.dataBook.DBBase

An individual item data book

coeff: str

Coefficient being plotted

I: numpy.ndarray[int]

List of indexes of cases to include in sweep

Keyword Arguments:
Outputs:
h: dict

Dictionary of plot handles

Versions:
  • 2016-04-04 @ddalle: Version 1.0

PlotRangeHistBase(coeff, I, **kw)

Plot a range histogram of one coefficient over several cases

Call:
>>> h = DBi.PlotRangeHistBase(coeff, I, **kw)
Inputs:
DBi: cape.cfdx.dataBook.DBBase

An individual item data book

coeff: str

Coefficient being plotted

I: numpy.ndarray[int]

List of indexes of cases to include in sweep

Keyword Arguments:
FigWidth: float

Figure width

FigHeight: float

Figure height

Label: {comp} | str

Manually specified label

Target: DBBase | list

Target database or list thereof

TargetValue: float | list[float]

Target or list of target values

TargetLabel: str | list (str)

Legend label(s) for target(s)

StDev: {3.6863} | None | float

Multiple of iterative history standard deviation to plot

HistOptions: dict

Plot options for the primary histogram

StDevOptions: dict

Dictionary of plot options for the standard deviation plot

DeltaOptions: dict

Options passed to plt.plot() for reference range plot

TargetOptions: dict

Options passed to plt.plot() for target value lines

OutlierSigma: {3.6863} | float

Standard deviation multiplier for determining outliers

ShowMu: bool

Option to print value of mean

ShowSigma: bool

Option to print value of standard deviation

ShowDelta: bool

Option to print reference value

ShowTarget: bool

Option to show target value

MuFormat: {"%.4f"} | str

Format for text label of the mean value

DeltaFormat: {"%.4f"} | str

Format for text label of the reference value

SigmaFormat: {"%.4f"} | str

Format for text label of the iterative standard deviation

TargetFormat: {"%.4f"} | str

Format for text label of the target value

XLabel: str

Specified label for x-axis, default is Iteration Number

YLabel: str

Specified label for y-axis; default is the coefficient name

Outputs:
h: dict

Dictionary of plot handles

Versions:
  • 2015-05-30 @ddalle: Version 1.0

  • 2015-12-14 @ddalle: Added error bars

  • 2016-04-04 @ddalle: Moved from point sensor to data book

ProcessColumns()

Process column names

Call:
>>> DBi.ProcessColumns()
Inputs:
DBi: cape.cfdx.dataBook.DBBase

Data book base object

Effects:
DBi.xCols: list (str)

List of trajectory keys

DBi.fCols: list (str)

List of floating point data columns

DBi.iCols: list (str)

List of integer data columns

DBi.cols: list (str)

Total list of columns

DBi.nxCol: int

Number of trajectory keys

DBi.nfCol: int

Number of floating point keys

DBi.niCol: int

Number of integer data columns

DBi.nCol: int

Total number of columns

Versions:
  • 2016-03-15 @ddalle: Version 1.0

ProcessConverters()

Process the list of converters to read and write each column

Call:
>>> DBP.ProcessConverters()
Inputs:
DBP: cape.cfdx.dataBook.DataBookBase

Data book base object

Effects:
DBP.rconv: list (function)

List of read converters

DBP.wflag: list (%i | %.12g | %s)

List of write flags

Versions:
  • 2016-03-15 @ddalle: Version 1.0

Read(fname=None, check=False, lock=False)

Read a data book statistics file

Call:
>>> DBc.Read()
>>> DBc.Read(fname, check=False, lock=False)
Inputs:
DBc: cape.cfdx.dataBook.DBBase

Data book base object

fname: str

Name of data file to read

check: True | {False}

Whether or not to check LOCK status

lock: True | {False}

If True, wait if the LOCK file exists

Versions:
  • 2015-12-04 @ddalle: Version 1.0

  • 2017-06-12 @ddalle: Added lock

ReadCopy(check=False, lock=False)

Read a copied database object

Call:
>>> DBc1 = DBc.ReadCopy(check=False, lock=False)
Inputs:
DBc: cape.cfdx.dataBook.DBBase

Data book base object

check: True | {False}

Whether or not to check LOCK status

lock: True | {False}

If True, wait if the LOCK file exists

Outputs:
DBc1: cape.cfdx.dataBook.DBBase

Copy of data book base object

Versions:
  • 2017-06-26 @ddalle: Version 1.0

Sort(key=None, I=None)

Sort a data book according to either a key or an index

Call:
>>> DBi.Sort()
>>> DBi.Sort(key)
>>> DBi.Sort(I=I)
Inputs:
DBi: cape.cfdx.dataBook.DBBase

An individual item data book

key: str

Name of trajectory key to use for sorting; default is first key

I: numpy.ndarray[int]

List of indices; must have same size as data book

Versions:
  • 2014-12-30 @ddalle: Version 1.0

  • 2017-04-18 @ddalle: Using np.lexsort()

TouchLock()

Touch a ‘LOCK’ file for a data book component to reset its mod time

Call:
>>> DBc.TouchLock()
Inputs:
DBc: cape.cfdx.dataBook.DataBookBase

Data book base object

Versions:
  • 2017-06-14 @ddalle: Version 1.0

TransformDBFM(topts, mask=None)

Transform force and moment coefficients

Available transformations and their parameters are

  • "Euler123": "phi", "theta", "psi"

  • "Euler321": "psi", "theta", "phi"

  • "ScaleCoeffs": "CA", "CY", "CN", "CLL", "CLM", "CLN"

Other variables (columns) in the databook are used to specify values to use for the transformation variables. For example,

topts = {
    "Type": "Euler321",
    "psi": "Psi",
    "theta": "Theta",
    "phi": "Phi",
}

will cause this function to perform a reverse Euler 3-2-1 transformation using dbc["Psi"], dbc["Theta"], and dbc["Phi"] as the angles.

Coefficient scaling can be used to fix incorrect reference areas or flip axes. The default is actually to flip CLL and CLN due to the transformation from CFD axes to standard flight dynamics axes.

topts = {
    "Type": "ScaleCoeffs",
    "CLL": -1.0,
    "CLN": -1.0,
}
Call:
>>> dbc.TransformDBFM(topts, mask=None)
Inputs:
dbc: DBBase

Instance of the force and moment class

topts: dict

Dictionary of options for the transformation

mask: {None} | np.ndarray[int]

Optional subset of cases to transform

Versions:
  • 2021-11-18 @ddalle: Version 1.0
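A sketch of applying the Euler 3-2-1 transformation from the example above to every case in the data book (the angle column names follow that example):

topts = {
    "Type": "Euler321",
    "psi": "Psi",        # databook column holding the yaw angle
    "theta": "Theta",    # databook column holding the pitch angle
    "phi": "Phi",        # databook column holding the roll angle
}
dbc.TransformDBFM(topts)    # mask=None transforms all cases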

Unlock()

Delete the LOCK file if it exists

Call:
>>> DBc.Unlock()
Inputs:
DBc: cape.cfdx.dataBook.DataBookBase

Data book base object

Versions:
  • 2017-06-12 @ddalle: Version 1.0
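A sketch of the LOCK-file protocol around a manual data book modification; Write(unlock=True) removes the LOCK file after the write:

if not DBc.CheckLock():        # skip if another process owns the component
    DBc.Lock()                 # claim the component with a LOCK file
    # ... modify DBc in memory ...
    DBc.Write(unlock=True)     # write to file and delete the LOCK file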

UpdateRunMatrix()

Match the trajectory to the cases in the data book

Call:
>>> DBi.UpdateRunMatrix()
Inputs:
DBi: cape.cfdx.dataBook.DBBase

Component data book

Versions:
  • 2017-04-18 @ddalle: Version 1.0

Write(fname=None, merge=False, unlock=True)

Write a single data book summary file

Call:
>>> DBi.Write()
>>> DBi.Write(fname, merge=False, unlock=True)
Inputs:
DBi: cape.cfdx.dataBook.DBBase

An individual item data book

fname: str

Name of data file to read

merge: True | {False}

Whether or not to attempt a merger before writing

unlock: {True} | False

Whether or not to delete any lock files

Versions:
  • 2015-12-04 @ddalle: Version 1.0

  • 2017-06-12 @ddalle: Added unlock

  • 2017-06-26 @ddalle: Added merge

mkdir(fdir)

Create a directory using settings from DataBook>umask

Call:
>>> DB.mkdir(fdir)
Inputs:
DB: cape.cfdx.dataBook.DataBook

Instance of the Cape data book class

fdir: str

Directory to create

Versions:
  • 2017-09-05 @ddalle: Version 1.0

class cape.cfdx.dataBook.DBComp(comp, cntl, targ=None, check=False, lock=False, **kw)

Individual force & moment component data book

This class is derived from cape.cfdx.dataBook.DBBase.

Call:
>>> DBi = DBComp(comp, cntl, targ=None, check=None, lock=None)
Inputs:
comp: str

Name of the component

cntl: Cntl

CAPE control class instance

targ: {None} | str

If used, read a duplicate data book as a target named targ

check: True | {False}

Whether or not to check LOCK status

lock: True | {False}

If True, wait if the LOCK file exists

Outputs:
DBi: cape.cfdx.dataBook.DBComp

An individual component data book

Versions:
  • 2014-12-20 @ddalle: Started

  • 2014-12-22 @ddalle: Version 1.0

  • 2016-06-27 @ddalle: Added target option for using other folders

class cape.cfdx.dataBook.DBTarget(targ, x, opts, RootDir=None)

Class to handle data from data book target files. There are more constraints on target files than the files that data book creates, and raw data books created by pyCart are not valid target files.

Call:
>>> DBT = DBTarget(targ, x, opts, RootDir=None)
Inputs:
targ: cape.cfdx.options.DataBook.DBTarget

Instance of a target source options interface

x: pyCart.runmatrix.RunMatrix

Run matrix interface

opts: cape.cfdx.options.Options

Options interface

RootDir: str

Root directory, defaults to os.getcwd()

Outputs:
DBT: cape.cfdx.dataBook.DBTarget

Instance of the Cape data book target class

Versions:
  • 2014-12-20 @ddalle: Started

  • 2015-01-10 @ddalle: Version 1.0

  • 2015-12-14 @ddalle: Added uncertainties

CheckColumn(ctargs, pt, cf, sfx)

Check a data book target column name and its consistency

Call:
>>> fi = DBT.CheckColumn(ctargs, pt, c)
Inputs:
DBT: cape.cfdx.dataBook.DBTarget

Instance of the data book target class

ctargs: dict

Dictionary of target column names for each coefficient

pt: str

Name of subcomponent (short for ‘point’)

c: str

Name of the coefficient in question, including suffix

Outputs:
fi: None | str

Name of the column in data book if present

Versions:
  • 2015-12-14 @ddalle: Version 1.0

FindMatch(DBc, i)

Find an entry by run matrix (trajectory) variables

Cases will be considered matches by comparing the run matrix variables specified in the "DataBook" section of cape.json. Suppose that the control file contains the following:

"DataBook": {
    "Targets": {
        "Experiment": {
            "File": "WT.dat",
            "RunMatrix": {"alpha": "ALPHA", "Mach": "MACH"}
            "Tolerances": {
                "alpha": 0.05,
                "Mach": 0.01
            }
        }
    }
}

Then any entry in the data book target that matches the Mach number within 0.01 (using a column labeled MACH) and alpha to within 0.05 is considered a match. If there are more trajectory variables, they are not used for this filtering of matches.

Call:
>>> j = DBT.FindMatch(x, i)
Inputs:
DBT: cape.cfdx.dataBook.DBTarget

Instance of the Cape data book target data carrier

x: cape.runmatrix.RunMatrix

The current pyCart trajectory (i.e. run matrix)

i: int

Index of the case from the trajectory to try match

Outputs:
j: numpy.ndarray[int]

Array of indices that match the trajectory within tolerances

Versions:
  • 2014-12-21 @ddalle: Version 1.0

  • 2016-06-27 @ddalle: Moved guts to DBBase

  • 2018-02-12 @ddalle: Moved first input to DBBase

GetCoeff(comp, coeff, I, **kw)

Get a coefficient value for one or more cases

Call:
>>> v = DBT.GetCoeff(comp, coeff, i)
>>> V = DBT.GetCoeff(comp, coeff, I)
Inputs:
DBT: cape.cfdx.dataBook.DBTarget

Instance of the Cape data book target class

comp: str

Component whose coefficient is being plotted

coeff: str

Coefficient being plotted

i: int

Individual case/entry index

I: numpy.ndarray[int]

List of indexes of cases to include in sweep

Outputs:
v: float

Scalar value from the appropriate column

V: np.ndarray

Array of values from the appropriate column

Versions:
  • 2018-02-12 @ddalle: Version 1.0

PlotCoeff(comp, coeff, I, **kw)

Plot a sweep of one coefficient over several cases

Call:
>>> h = DBT.PlotCoeff(comp, coeff, I, **kw)
Inputs:
DBT: cape.cfdx.dataBook.DBTarget

Instance of the Cape data book target class

comp: str

Component whose coefficient is being plotted

coeff: str

Coefficient being plotted

I: numpy.ndarray[int]

List of indexes of cases to include in sweep

Keyword Arguments:
x: [ {None} | str ]

RunMatrix key for x axis (or plot against index if None)

Label: [ {comp} | str ]

Manually specified label

Legend: [ {True} | False ]

Whether or not to use a legend

StDev: [ {None} | float ]

Multiple of iterative history standard deviation to plot

MinMax: [ {False} | True ]

Whether to plot minimum and maximum over iterative history

Uncertainty: [ {False} | True ]

Whether to plot direct uncertainty

LineOptions: dict

Plot options for the primary line(s)

StDevOptions: dict

Dictionary of plot options for the standard deviation plot

MinMaxOptions: dict

Dictionary of plot options for the min/max plot

UncertaintyOptions: dict

Dictionary of plot options for the uncertainty plot

FigWidth: float

Width of figure in inches

FigHeight: float

Height of figure in inches

PlotTypeStDev: [ {‘FillBetween’} | ‘ErrorBar’ ]

Plot function to use for standard deviation plot

PlotTypeMinMax: [ {‘FillBetween’} | ‘ErrorBar’ ]

Plot function to use for min/max plot

PlotTypeUncertainty: [ ‘FillBetween’ | {‘ErrorBar’} ]

Plot function to use for uncertainty plot

Outputs:
h: dict

Dictionary of plot handles

Versions:
  • 2015-05-30 @ddalle: Version 1.0

  • 2015-12-14 @ddalle: Added uncertainties

ProcessColumns()

Process data columns and split into dictionary keys

Call:
>>> DBT.ProcessColumns()
Inputs:
DBT: cape.cfdx.dataBook.DBTarget

Instance of the data book target class

Versions:
  • 2015-06-03 @ddalle: Copied from __init__() method

  • 2015-12-14 @ddalle: Added support for point sensors

ReadAllData(fname, delimiter=',', skiprows=0)

Read target data file all at once

Call:
>>> DBT.ReadAllData(fname, delimiter=",", skiprows=0)
Inputs:
DBT: cape.cfdx.dataBook.DBTarget

Instance of the Cape data book target class

fname: str

Name of file to read

delimiter: str

Data delimiter character(s)

skiprows: int

Number of header rows to skip

Versions:
  • 2015-09-07 @ddalle: Version 1.0

ReadData()

Read data file according to stored options

Call:
>>> DBT.ReadData()
Inputs:
DBT: cape.cfdx.dataBook.DBTarget

Instance of the data book target class

Versions:
  • 2015-06-03 @ddalle: Copied from __init__() method

ReadDataByColumn(fname, delimiter=',', skiprows=0)

Read target data one column at a time

Call:
>>> DBT.ReadDataByColumn(fname, delimiter=",", skiprows=0)
Inputs:
DBT: cape.cfdx.dataBook.DBTarget

Instance of the Cape data book target class

fname: str

Name of file to read

delimiter: str

Data delimiter character(s)

skiprows: int

Number of header rows to skip

Versions:
  • 2015-09-07 @ddalle: Version 1.0

UpdateRunMatrix()

Match the trajectory to the cases in the data book

Call:
>>> DBT.UpdateRunMatrix()
Inputs:
DBT: cape.cfdx.dataBook.DBTarget

Instance of the data book target class

Versions:
  • 2015-06-03 @ddalle: Version 1.0

class cape.cfdx.dataBook.DBTriqFM(x, opts, comp, **kw)

Force and moment component extracted from surface triangulation

Call:
>>> DBF = DBTriqFM(x, opts, comp, RootDir=None)
Inputs:
x: cape.runmatrix.RunMatrix

RunMatrix/run matrix interface

opts: cape.cfdx.options.Options

Options interface

comp: str

Name of TriqFM component

RootDir: {None} | str

Root directory for the configuration

check: True | {False}

Whether or not to check LOCK status

lock: True | {False}

If True, wait if the LOCK file exists

Outputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

Versions:
  • 2017-03-28 @ddalle: Version 1.0

ApplyTransformations(i, FM)

Apply transformations to forces and moments

Call:
>>> FM = DBF.ApplyTransformations(i, FM)
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

i: int

Case index

FM: dict (dict[float])

Dictionary of force & moment coefficients

Outputs:
FM: dict (dict[float])

Dictionary of transformed force & moment coefficients

Versions:
  • 2017-03-29 @ddalle: Version 1.0

GetCompID(patch)

Get the component ID name(s) or number(s) to use for each patch

Call:
>>> compID = DBF.GetCompID(patch)
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

patch: str

Name of patch

Outputs:
compID: {patch} | str | int | list

Name, number, or list thereof of patch in map tri file

Versions:
  • 2017-03-28 @ddalle: Version 1.0

GetConditions(i)

Get the freestream conditions needed for forces

Call:
>>> xi = DBF.GetConditions(i)
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

i: int

Case index

Outputs:
xi: dict

Dictionary of Mach number (mach), Reynolds number (Re)

Versions:
  • 2017-03-28 @ddalle: Version 1.0

GetDimensionalForces(patch, i, FM)

Get dimensional forces

This dimensionalizes any force or moment coefficient already in FM, replacing the first character 'C' with 'F'. For example, "FA" is the dimensional axial force from "CA", and "FAv" is the dimensional axial component of the viscous force. A brief sketch of this naming convention follows this entry.

Call:
>>> FM = DBF.GetDimensionalForces(patch, i, FM)
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

patch: str

Name of patch

i: int

Case index

FM: dict[float]

Dictionary of force & moment coefficients

Outputs:
FM: dict[float]

Dictionary of force & moment coefficients

Versions:
  • 2017-03-29 @ddalle: Version 1.0
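
The exact reference quantities cape uses here are taken from the case conditions and configuration options; the self-contained sketch below only illustrates the C-to-F naming convention, and the dynamic pressure and reference area values are placeholders.

# Minimal sketch of the "C" -> "F" naming convention for force coefficients.
# qinf and Aref are placeholder values, not values read from a case.
# (Moment coefficients would also carry a reference length; omitted here.)
qinf = 250.0   # hypothetical freestream dynamic pressure
Aref = 3.14    # hypothetical reference area

FM = {"CA": 0.45, "CAv": 0.02, "CY": -0.01, "CN": 1.20}
for k, v in list(FM.items()):
    if k.startswith("C"):
        # "CA" -> "FA", "CAv" -> "FAv", etc.
        FM["F" + k[1:]] = qinf * Aref * v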

GetPatchCompIDs()

Get the list of component IDs mapped from the template tri

Call:
>>> CompIDs = DBF.GetPatchCompIDs()
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

Outputs:
CompIDs: list[int] | None

List of component IDs that came from the mapping file

Versions:
  • 2017-03-30 @ddalle: Version 1.0

GetRefComponent()

Get the first component

Call:
>>> DBc = DBF.GetRefComponent()
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

Outputs:
DBc: cape.cfdx.dataBook.DBComp

Data book for one component

Versions:
  • 2016-08-18 @ddalle: Version 1.0

  • 2017-04-05 @ddalle: Had to customize for TriqFM

GetStateVars(patch, FM)

Get additional state variables, such as minimum Cp

Call:
>>> FM = DBF.GetStateVars(patch, FM)
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

patch: str

Name of patch

FM: dict[float]

Dictionary of force & moment coefficients

Outputs:
FM: dict[float]

Dictionary of force & moment coefficients

Versions:
  • 2017-03-28 @ddalle: Version 1.0

GetTriqFile()

Get most recent triq file and its associated iterations

Call:
>>> qtriq, ftriq, n, i0, i1 = DBF.GetTriqFile()
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

Outputs:
qtriq: {False}

Whether or not to convert file from other format

ftriq: str

Name of triq file

n: int

Number of iterations included

i0: int

First iteration in the averaging

i1: int

Last iteration in the averaging

Versions:
  • 2016-12-19 @ddalle: Added to the module

GetTriqForces(i, **kw)

Get the forces, moments, and other states on each patch

Call:
>>> FM = DBF.GetTriqForces(i)
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

i: int

Case index

Outputs:
FM: dict (dict[float])

Dictionary of force & moment dictionaries for each patch

Versions:
  • 2017-03-28 @ddalle: Version 1.0

GetTriqForcesPatch(patch, i, **kw)

Get the forces and moments on a patch

Call:
>>> FM = DBF.GetTriqForcesPatch(patch, i, **kw)
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

patch: str

Name of patch

i: int

Case index

Outputs:
FM: dict[float]

Dictionary of force & moment coefficients

Versions:
  • 2017-03-28 @ddalle: Version 1.0

Lock()

Lock the data book component

Call:
>>> DBF.Lock()
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

Versions:
  • 2017-06-12 @ddalle: Version 1.0

MapTriCompID()

Perform any component ID mapping if necessary

Call:
>>> DBF.MapTriCompID()
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

Attributes:
DBF.compmap: dict

Map of component numbers altered during the mapping

Versions:
  • 2017-03-28 @ddalle: Version 1.0

Merge(DBF1)

Merge another copy of the TriqFM data book into this one

Call:
>>> DBF.Merge(DBF1)
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

DBF1: cape.cfdx.dataBook.DBTriqFM

Another instance of related TriqFM data book

Versions:
  • 2016-06-26 @ddalle: Version 1.0

PreprocessTriq(ftriq, **kw)

Perform any necessary preprocessing to create triq file

Call:
>>> ftriq = DBF.PreprocessTriq(ftriq, qpbs=False, f=None)
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

ftriq: str

Name of triq file

i: {None} | int

Case index

Versions:
  • 2016-12-19 @ddalle: Version 1.0

  • 2016-12-21 @ddalle: Added PBS

ReadCopy(check=False, lock=False)

Read a copied database object

Call:
>>> DBF1 = DBF.ReadCopy(check=False, lock=False)
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

check: True | {False}

Whether or not to check LOCK status

lock: True | {False}

If True, wait if the LOCK file exists

Outputs:
DBF1: cape.cfdx.dataBook.DBTriqFM

Another instance of related TriqFM data book

Versions:
  • 2017-06-26 @ddalle: Version 1.0

ReadTriMap()

Read the triangulation to use for mapping

Call:
>>> DBF.ReadTriMap()
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

Versions:
  • 2017-03-28 @ddalle: Version 1.0

ReadTriq(ftriq)

Read a triq annotated surface triangulation

Call:
>>> DBF.ReadTriq(ftriq)
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

ftriq: str

Name of triq file

Versions:
  • 2017-03-28 @ddalle: Version 1.0

SelectTriq()

Select the components of triq that are mapped patches

Call:
>>> triq = DBF.SelectTriq()
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

Outputs:
triq: cape.tri.Triq

Interface to annotated surface triangulation

Versions:
  • 2017-03-30 @ddalle: Version 1.0

Sort()

Sort the TriqFM data book component

Call:
>>> DBF.Sort()
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

Versions:
  • 2016-03-08 @ddalle: Version 1.0

TouchLock()

Touch a ‘LOCK’ file for a data book component to reset its mod time

Call:
>>> DBF.TouchLock()
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

Versions:
  • 2017-06-14 @ddalle: Version 1.0

TransformFM(FM, topts, i)

Transform a force and moment history

Available transformations and their parameters are listed below.

  • “Euler321”: “psi”, “theta”, “phi”

  • “ScaleCoeffs”: “CA”, “CY”, “CN”, “CLL”, “CLM”, “CLN”

RunMatrix variables are used to specify values to use for the transformation variables. For example,

topts = {"Type": "Euler321",
    "psi": "Psi", "theta": "Theta", "phi": "Phi"}

will cause this function to perform a reverse Euler 3-2-1 transformation using x.Psi[i], x.Theta[i], and x.Phi[i] as the angles.

Coefficient scaling can be used to fix incorrect reference areas or flip axes. The default is actually to flip CLL and CLN due to the transformation from CFD axes to standard flight dynamics axes.

topts = {"Type": "ScaleCoeffs",
    "CLL": -1.0, "CLN": -1.0}
Call:
>>> FM = DBF.TransformFM(FM, topts, i)
Inputs:
FM: cape.cfdx.dataBook.CaseFM

Instance of the force and moment class

topts: dict

Dictionary of options for the transformation

x: cape.runmatrix.RunMatrix

The run matrix used for this analysis

i: int

Index of the case in the current run matrix

Versions:
  • 2014-12-22 @ddalle: Version 1.0

Triq2Plt(triq, **kw)

Convert an annotated tri (TRIQ) interface to Tecplot (PLT)

Call:
>>> plt = DBF.Triq2Plt(triq, **kw)
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

triq: cape.tri.Triq

Interface to annotated surface triangulation

i: {None} | int

Index number if needed

t: {1.0} | float

Time step or iteration number

Outputs:
plt: cape.plt.Plt

Binary Tecplot interface

Versions:
  • 2017-03-30 @ddalle: Version 1.0

Unlock()

Unlock the data book component (delete lock file)

Call:
>>> DBF.Unlock()
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

Versions:
  • 2017-06-12 @ddalle: Version 1.0

UpdateCase(i)

Prepare to update a TriqFM group if necessary

Call:
>>> n = DBF.UpdateCase(i)
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

i: int

Case index

Outputs:
n: 0 | 1

How many updates were made

Versions:
  • 2017-03-28 @ddalle: Version 1.0

Write(merge=False, unlock=True)

Write each patch data book in the TriqFM group to file

Call:
>>> DBF.Write(merge=False, unlock=True)
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

merge: True | {False}

Whether or not to reread data book and merge before writing

unlock: {True} | False

Whether or not to delete any lock file

Versions:
  • 2015-12-04 @ddalle: Version 1.0

  • 2017-06-26 @ddalle: Version 1.0

WriteTriq(i, **kw)

Write mapped solution as TRIQ or Tecplot file with zones

Call:
>>> DBF.WriteTriq(i, **kw)
Inputs:
DBF: cape.cfdx.dataBook.DBTriqFM

Instance of TriqFM data book

i: int

Case index

t: {1} | float

Iteration number

Versions:
  • 2017-03-30 @ddalle: Version 1.0

class cape.cfdx.dataBook.DBTriqFMComp(x, opts, comp, patch=None, **kw)

Force and moment component extracted from surface triangulation

Call:
>>> DBF = DBTriqFMComp(x, opts, comp, patch=None, RootDir=None)
Inputs:
x: cape.runmatrix.RunMatrix

RunMatrix/run matrix interface

opts: cape.cfdx.options.Options

Options interface

comp: str

Name of TriqFM component

RootDir: {None} | str

Root directory for the configuration

check: True | {False}

Whether or not to check LOCK status

lock: True | {False}

If True, wait if the LOCK file exists

Outputs:
DBF: cape.cfdx.dataBook.DBTriqFMComp

Instance of TriqFM data book

Versions:
  • 2017-03-28 @ddalle: Version 1.0

Data book classes for individual cases

class cape.cfdx.dataBook.CaseData

Base class for case iterative histories

Call:
>>> FM = CaseData()
Outputs:
FM: cape.cfdx.dataBook.CaseData

Base iterative history class

Versions:
  • 2015-12-07 @ddalle: Version 1.0

ExtractValue(c, col=None, **kw)

Extract the iterative history for one coefficient/state

This function may be customized for some modules

Call:
>>> C = FM.ExtractValue(c)
>>> C = FM.ExtractValue(c, col=None)
Inputs:
FM: cape.cfdx.dataBook.CaseData

Case component history class

c: str

Name of state

col: {None} | int

Column number

Outputs:
C: np.ndarray

Array of values for c at each iteration or sample interval

Versions:
  • 2015-12-07 @ddalle: Version 1.0

GetIterationIndex(i)

Return index of a particular iteration in FM.i

If the iteration i is not present in the history, the index of the last available iteration less than or equal to i is returned.

Call:
>>> j = FM.GetIterationIndex(i)
Inputs:
FM: cape.cfdx.dataBook.CaseData

Case component history class

i: int

Iteration number

Outputs:
j: int

Index of last iteration in FM.i less than or equal to i

Versions:
  • 2015-03-06 @ddalle: Version 1.0

  • 2015-12-07 @ddalle: Copied from CaseFM
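
The lookup behavior described above (the last available iteration less than or equal to i) can be illustrated with a small numpy sketch; this mirrors the documented behavior rather than reproducing cape's internal code.

import numpy as np

# Iteration numbers as they might appear in FM.i
iters = np.array([100, 200, 300, 500, 800])

def last_index_le(iters, i):
    # Index of the last iteration <= i (documented GetIterationIndex behavior)
    return int(np.where(iters <= i)[0][-1])

print(last_index_le(iters, 500))  # -> 3 (exact match)
print(last_index_le(iters, 650))  # -> 3 (last iteration <= 650 is 500)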

PlotValue(c, col=None, n=None, **kw)

Plot an iterative history of some value named c

Call:
>>> h = FM.PlotValue(c, n=None, **kw)
Inputs:
FM: cape.cfdx.dataBook.CaseData

Case component history class

c: str

Name of coefficient to plot, e.g. 'CA'

col: str | int | None

Select a column by name or index

n: int

Only show the last n iterations

nMin: {0} | int

First iteration allowed for use in averaging

nAvg, nStats: {100} | int

Use at least the last nAvg iterations to compute an average

dnAvg, dnStats: {nStats} | int

Use intervals of dnStats iterations for candidate windows

nMax, nMaxStats: {nStats} | int

Use at most nMax iterations

d: float

Delta in the coefficient to show expected range

k: float

Multiple of iterative standard deviation to plot

u: float

Multiple of sampling error standard deviation to plot

err: float

Fixed sampling error; default uses util.SearchSinusoidFit()

nLast: int

Last iteration to use (defaults to last iteration available)

nFirst: int

First iteration to plot

FigWidth: float

Figure width

FigHeight: float

Figure height

LineOptions: dict

Dictionary of additional options for line plot

StDevOptions: dict

Options passed to plt.fill_between() for stdev plot

ErrPltOptions: dict

Options passed to plt.fill_between() for uncertainty plot

DeltaOptions: dict

Options passed to plt.plot() for reference range plot

MeanOptions: dict

Options passed to plt.plot() for mean line

ShowMu: bool

Option to print value of mean

ShowSigma: bool

Option to print value of standard deviation

ShowError: bool

Option to print value of sampling error

ShowDelta: bool

Option to print reference value

MuFormat: {"%.4f"} | str

Format for text label of the mean value

DeltaFormat: {"%.4f"} | str

Format for text label of the reference value d

SigmaFormat: {"%.4f"} | str

Format for text label of the iterative standard deviation

ErrorFormat: {"%.4f"} | str

Format for text label of the sampling error

XLabel: str

Specified label for x-axis, default is "Iteration Number"

YLabel: str

Specified label for y-axis, default is c

Grid: {None} | True | False

Turn on/off major grid lines, or leave as is if None

GridStyle: {{}} | dict

Dictionary of major grid line line style options

MinorGrid: {None} | True | False

Turn on/off minor grid lines, or leave as is if None

MinorGridStyle: {{}} | dict

Dictionary of minor grid line line style options

Ticks: {None} | False

Turn off ticks if False

XTicks: {Ticks} | None | False | list

Option for x-axis tick levels, turn off if False or []

YTicks: {Ticks} | None | False | list

Option for y-axis tick levels, turn off if False or []

TickLabels: {None} | False

Turn off tick labels if False

XTickLabels: {TickLabels} | None | False | list

Option for x-axis tick labels, turn off if False or []

YTickLabels: {TickLabels} | None | False | list

Option for y-axis tick labels, turn off if False or []

Outputs:
h: dict

Dictionary of figure/plot handles

Versions:
  • 2014-11-12 @ddalle: Version 1.0

  • 2014-12-09 @ddalle: Transferred to AeroPlot

  • 2015-02-15 @ddalle: Transferred to dataBook.Aero

  • 2015-03-04 @ddalle: Added nStart and nLast

  • 2015-12-07 @ddalle: Moved to basis class

  • 2017-10-12 @ddalle: Added grid and tick options
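
A typical call is sketched below, assuming a case history FM has already been read and matplotlib is installed; the option values are illustrative.

# Plot the last 500 iterations of CA, averaging over at least the last 200,
# with a shaded 1-sigma band and the mean/sigma printed on the figure
h = FM.PlotValue(
    "CA",
    n=500,          # only show the last 500 iterations
    nStats=200,     # minimum averaging window
    k=1.0,          # plot +/- 1 iterative standard deviation
    ShowMu=True,    # print the mean value
    ShowSigma=True, # print the standard deviation
    FigWidth=6.0, FigHeight=4.5)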

PlotValueHist(coeff, nAvg=100, nLast=None, **kw)

Plot a histogram of the iterative history of some value c

Call:
>>> h = FM.PlotValueHist(c, nAvg=100, nLast=None, **kw)
Inputs:
FM: cape.cfdx.dataBook.CaseData

Instance of the component force history class

comp: str

Name of component to plot

c: str

Name of coefficient to plot, e.g. 'CA'

nAvg: int

Use the last nAvg iterations to compute an average

nBins: {20} | int

Number of bins in histogram, also can be set in HistOptions

nLast: int

Last iteration to use (defaults to last iteration available)

Keyword Arguments:
FigWidth: float

Figure width

FigHeight: float

Figure height

Label: [ {comp} | str ]

Manually specified label

TargetValue: float | list[float]

Target or list of target values

TargetLabel: str | list (str)

Legend label(s) for target(s)

StDev: [ {None} | float ]

Multiple of iterative history standard deviation to plot

HistOptions: dict

Plot options for the primary histogram

StDevOptions: dict

Dictionary of plot options for the standard deviation plot

DeltaOptions: dict

Options passed to plt.plot() for reference range plot

MeanOptions: dict

Options passed to plt.plot() for mean line

TargetOptions: dict

Options passed to plt.plot() for target value lines

OutlierSigma: {7.0} | float

Standard deviation multiplier for determining outliers

ShowMu: bool

Option to print value of mean

ShowSigma: bool

Option to print value of standard deviation

ShowError: bool

Option to print value of sampling error

ShowDelta: bool

Option to print reference value

ShowTarget: bool

Option to show target value

MuFormat: {"%.4f"} | str

Format for text label of the mean value

DeltaFormat: {"%.4f"} | str

Format for text label of the reference value d

SigmaFormat: {"%.4f"} | str

Format for text label of the iterative standard deviation

TargetFormat: {"%.4f"} | str

Format for text label of the target value

XLabel: str

Specified label for x-axis, default is Iteration Number

YLabel: str

Specified label for y-axis, default is c

Outputs:
h: dict

Dictionary of figure/plot handles

Versions:
  • 2015-02-15 @ddalle: Version 1.0

  • 2015-03-06 @ddalle: Added nLast and fixed documentation

  • 2015-03-06 @ddalle: Copied to CaseFM

class cape.cfdx.dataBook.CaseFM(comp)

This class contains methods for reading data about the iterative history of an individual component for a single case. The list of available components comes from a loadsCC.dat file if one exists.

Call:
>>> FM = cape.cfdx.dataBook.CaseFM(C, MRP=None, A=None)
Inputs:
C: list (str)

List of coefficients to initialize

MRP: numpy.ndarray[float] shape=(3,)

Moment reference point

A: numpy.ndarray shape=(N,4) or shape=(N,7)

Matrix of forces and/or moments at N iterations

Outputs:
FM: cape.cfdx.dataBook.CaseFM

Instance of the force and moment class

FM.C: list (str)

List of coefficients

FM.MRP: numpy.ndarray[float] shape=(3,)

Moment reference point

FM.i: numpy.ndarray shape=(0,)

List of iteration numbers

FM.CA: numpy.ndarray shape=(0,)

Axial force coefficient at each iteration

FM.CY: numpy.ndarray shape=(0,)

Lateral force coefficient at each iteration

FM.CN: numpy.ndarray shape=(0,)

Normal force coefficient at each iteration

FM.CLL: numpy.ndarray shape=(0,)

Rolling moment coefficient at each iteration

FM.CLM: numpy.ndarray shape=(0,)

Pitching moment coefficient at each iteration

FM.CLN: numpy.ndarray shape=(0,)

Yaw moment coefficient at each iteration

Versions:
  • 2014-11-12 @ddalle: Starter version

  • 2014-12-21 @ddalle: Copied from previous aero.FM

AddData(A)

Add iterative force and/or moment history for a component

Call:
>>> FM.AddData(A)
Inputs:
FM: cape.cfdx.dataBook.CaseFM

Instance of the force and moment class

A: numpy.ndarray shape=(N,4) or shape=(N,7)

Matrix of forces and/or moments at N iterations

Versions:
  • 2014-11-12 @ddalle: Version 1.0

  • 2015-10-16 @ddalle: Version 2.0, complete rewrite

Copy()

Copy an iterative force & moment history

Call:
>>> FM2 = FM1.Copy()
Inputs:
FM1: cape.cfdx.dataBook.CaseFM

Force and moment history

Outputs:
FM2: cape.cfdx.dataBook.CaseFM

Copy of FM1

Versions:
  • 2017-03-20 @ddalle: Version 1.0

GetStats(nStats=100, nMax=None, **kw)

Get mean, min, max, and standard deviation for all coefficients

Call:
>>> s = FM.GetStats(nStats, nMax=None, nLast=None)
Inputs:
FM: cape.cfdx.dataBook.CaseFM

Instance of the force and moment class

coeff: str

Name of coefficient to process

nStats: {100} | int

Minimum number of iterations in window to use for statistics

dnStats: {nStats} | int

Interval size for candidate windows

nMax: {nStats} | int

Maximum number of iterations to use for statistics

nMin: {0} | int

First usable iteration number

nLast: {FM.i[-1]} | int

Last iteration to use for statistics

Outputs:
s: dict[float]

Dictionary of mean, min, max, std, err for each coefficient

Versions:
  • 2017-09-29 @ddalle: Version 1.0
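
One plausible reading of the nStats/dnStats/nMax options is sketched below in plain numpy: candidate trailing windows grow from nStats iterations in steps of dnStats up to nMax, and statistics are gathered over each. This illustrates the option semantics only; it is not cape's actual window-selection code.

import numpy as np

def candidate_window_stats(y, nStats=100, dnStats=None, nMax=None):
    # Illustrative statistics over candidate trailing windows of a history y
    dnStats = nStats if dnStats is None else dnStats
    nMax = nStats if nMax is None else nMax
    out = []
    n = nStats
    while n <= min(nMax, y.size):
        w = y[-n:]   # trailing window of n samples
        out.append({"n": n, "mu": w.mean(), "std": w.std(),
                    "min": w.min(), "max": w.max()})
        n += dnStats
    return out

y = np.random.default_rng(0).normal(1.0, 0.01, 1000)
stats = candidate_window_stats(y, nStats=100, dnStats=100, nMax=400)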

GetStatsCoeff(coeff, nStats=100, nMax=None, **kw)

Get mean, min, max, and other statistics for one coefficient

Call:
>>> s = FM.GetStatsCoeff(coeff, nStats=100, nMax=None, **kw)
Inputs:
FM: cape.cfdx.dataBook.CaseFM

Instance of the force and moment class

coeff: str

Name of coefficient to process

nStats: {100} | int

Minimum number of iterations in window to use for statistics

dnStats: {nStats} | int

Interval size for candidate windows

nMax: {nStats} | int

Maximum number of iterations to use for statistics

nMin: {0} | int

First usable iteration number

nLast: {FM.i[-1]} | int

Last iteration to use for statistics

Outputs:
s: dict[float]

Dictionary of mean, min, max, std for coeff

Versions:
  • 2017-09-29 @ddalle: Version 1.0

GetStatsN(nStats=100, nLast=None)

Get mean, min, max, and standard deviation for all coefficients

Call:
>>> s = FM.GetStatsN(nStats, nLast=None)
Inputs:
FM: cape.cfdx.dataBook.CaseFM

Instance of the force and moment class

nStats: int

Number of iterations in window to use for statistics

nLast: int

Last iteration to use for statistics

Outputs:
s: dict[float]

Dictionary of mean, min, max, std for each coefficient

Versions:
  • 2014-12-09 @ddalle: Version 1.0

  • 2015-02-28 @ddalle: Renamed from GetStats()

  • 2015-03-04 @ddalle: Added last iteration capability

GetStatsOld(nStats=100, nMax=None, nLast=None)

Get mean, min, max, and standard deviation for all coefficients

Call:
>>> s = FM.GetStatsOld(nStats, nMax=None, nLast=None)
Inputs:
FM: cape.cfdx.dataBook.CaseFM

Instance of the force and moment class

nStats: int

Minimum number of iterations in window to use for statistics

nMax: int

Maximum number of iterations to use for statistics

nLast: int

Last iteration to use for statistics

Outputs:
s: dict[float]

Dictionary of mean, min, max, std for each coefficient

Versions:
  • 2015-02-28 @ddalle: Version 1.0

  • 2015-03-04 @ddalle: Added last iteration capability

PlotCoeff(c, n=None, **kw)

Plot a single coefficient history

Call:
>>> h = FM.PlotCoeff(c, n=1000, nAvg=100, **kw)
Inputs:
FM: cape.cfdx.dataBook.CaseFM

Instance of the component force history class

c: str

Name of coefficient to plot, e.g. 'CA'

n: int

Only show the last n iterations

nAvg: int

Use the last nAvg iterations to compute an average

d: float

Delta in the coefficient to show expected range

nLast: int

Last iteration to use (defaults to last iteration available)

nFirst: int

First iteration to plot

FigWidth: float

Figure width

FigHeight: float

Figure height

Outputs:
h: dict

Dictionary of figure/plot handles

Versions:
  • 2014-11-12 @ddalle: Version 1.0

  • 2014-12-09 @ddalle: Transferred to AeroPlot

  • 2015-02-15 @ddalle: Transferred to dataBook.Aero

  • 2015-03-04 @ddalle: Added nStart and nLast

  • 2015-12-07 @ddalle: Moved content to base class

PlotCoeffHist(c, nAvg=100, nBin=20, nLast=None, **kw)

Plot a single coefficient histogram

Call:
>>> h = FM.PlotCoeffHist(c, nAvg=100, nBin=20, **kw)
Inputs:
FM: cape.cfdx.dataBook.CaseFM

Instance of the component force history class

comp: str

Name of component to plot

c: str

Name of coefficient to plot, e.g. 'CA'

nAvg: int

Use the last nAvg iterations to compute an average

nBin: int

Number of bins to plot

nLast: int

Last iteration to use (defaults to last iteration available)

FigWidth: float

Figure width

FigHeight: float

Figure height

Outputs:
h: dict

Dictionary of figure/plot handles

Versions:
  • 2015-02-15 @ddalle: Version 1.0

  • 2015-03-06 @ddalle: Added nLast and fixed documentation

  • 2015-03-06 @ddalle: Copied to CaseFM

ShiftMRP(Lref, x, xi=None)

Shift the moment reference point

Call:
>>> FM.ShiftMRP(Lref, x, xi=None)
Inputs:
FM: cape.cfdx.dataBook.CaseFM

Instance of the force and moment class

Lref: float

Reference length

x: list[float]

Target moment reference point

xi: list[float]

Current moment reference point (default: self.MRP)

Versions:
  • 2015-03-02 @ddalle: Version 1.0
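
The parallel-axis relation behind a moment-reference-point shift can be sketched as follows. The sign convention depends on the body axes, so the signs below are assumptions for illustration; only the proportionality between the MRP shift and the force coefficients is the point.

import numpy as np

# Illustrative MRP shift along the body x-axis (signs are assumptions)
Lref = 1.0                           # reference length
x_old = np.array([0.0, 0.0, 0.0])    # current moment reference point
x_new = np.array([2.5, 0.0, 0.0])    # target moment reference point
CN, CY = 1.2, -0.05                  # normal and side force coefficients
CLM, CLN = 0.10, 0.02                # pitching and yawing moment coefficients

dx = (x_new[0] - x_old[0]) / Lref
CLM_new = CLM + dx * CN              # pitching moment picks up a normal-force term
CLN_new = CLN - dx * CY              # yawing moment picks up a side-force term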

TransformFM(topts, x, i)

Transform a force and moment history

Available transformations and their parameters are listed below.

  • “Euler321”: “psi”, “theta”, “phi”

  • “Euler123”: “phi”, “theta”, “psi”

  • “ScaleCoeffs”: “CA”, “CY”, “CN”, “CLL”, “CLM”, “CLN”

RunMatrix variables are used to specify values to use for the transformation variables. For example,

topts = {"Type": "Euler321",
    "psi": "Psi", "theta": "Theta", "phi": "Phi"}

will cause this function to perform a reverse Euler 3-2-1 transformation using x.Psi[i], x.Theta[i], and x.Phi[i] as the angles.

Coefficient scaling can be used to fix incorrect reference areas or flip axes. The default is actually to flip CLL and CLN due to the transformation from CFD axes to standard flight dynamics axes.

topts = {"Type": "ScaleCoeffs",
    "CLL": -1.0, "CLN": -1.0}
Call:
>>> FM.TransformFM(topts, x, i)
Inputs:
FM: cape.cfdx.dataBook.CaseFM

Instance of the force and moment class

topts: dict

Dictionary of options for the transformation

x: cape.runmatrix.RunMatrix

The run matrix used for this analysis

i: int

Run matrix case index

Versions:
  • 2014-12-22 @ddalle: Version 1.0
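
Putting the two documented transformation types together, a usage sketch (assuming an already-loaded force history FM, a run matrix x with Psi, Theta, and Phi keys, and a case index i) looks like:

# Reverse Euler 3-2-1 rotation driven by run matrix angles; the values of
# "psi", "theta", and "phi" name run matrix keys, as documented above
topts = {"Type": "Euler321", "psi": "Psi", "theta": "Theta", "phi": "Phi"}
FM.TransformFM(topts, x, i)

# Flip CLL and CLN, e.g. to go from CFD axes to flight dynamics axes
topts = {"Type": "ScaleCoeffs", "CLL": -1.0, "CLN": -1.0}
FM.TransformFM(topts, x, i)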

TrimIters()

Trim non-ascending iterations and other problems

Call:
>>> FM.TrimIters()
Versions:
  • 2017-10-02 @ddalle: Version 1.0

class cape.cfdx.dataBook.CaseResid

Iterative history class

This class provides an interface to residuals, CPU time, and similar data for a given run directory

Call:
>>> hist = cape.cfdx.dataBook.CaseResid()
Outputs:
hist: cape.cfdx.dataBook.CaseResid

Instance of the run history class

Versions:
  • 2014-11-12 @ddalle: Starter version

GetIterationIndex(i)

Return index of a particular iteration in hist.i

If the iteration i is not present in the history, the index of the last available iteration less than or equal to i is returned.

Call:
>>> j = hist.GetIterationIndex(i)
Inputs:
hist: cape.cfdx.dataBook.CaseResid

Instance of the residual history class

i: int

Iteration number

Outputs:
j: int

Index of last iteration in hist.i less than or equal to i

Versions:
  • 2015-03-06 @ddalle: Version 1.0

GetNOrders(nStats=1)

Get the number of orders of magnitude of residual drop

Call:
>>> nOrders = hist.GetNOrders(nStats=1)
Inputs:
hist: cape.cfdx.dataBook.CaseResid

Instance of the DataBook residual history

nStats: int

Number of iterations to use for averaging the final residual

Outputs:
nOrders: float

Number of orders of magnitude of residual drop

Versions:
  • 2015-01-01 @ddalle: First version
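
"Orders of magnitude of residual drop" is a base-10 comparison between the initial residual and a trailing average. The self-contained sketch below illustrates the idea with a synthetic history; it is not cape's internal code.

import numpy as np

# Synthetic L1 residual history decaying over 500 iterations
L1 = 1.0e-2 * 10.0 ** (-np.linspace(0.0, 4.0, 500))

nStats = 10
L1_final = L1[-nStats:].mean()                 # average of the last nStats values
nOrders = np.log10(L1[0]) - np.log10(L1_final)
print(round(nOrders, 2))                       # roughly 4 orders of magnitude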

GetNOrdersUnsteady(n=1)

Get the number of orders of magnitude of unsteady residual drop for each of the last n unsteady iteration cycles.

Call:
>>> nOrders = hist.GetNOrdersUnsteady(n=1)
Inputs:
hist: cape.cfdx.dataBook.CaseResid

Instance of the DataBook residual history

n: int

Number of iterations to analyze

Outputs:
nOrders: numpy.ndarray[float], shape=(n,)

Number of orders of magnitude of unsteady residual drop

Versions:
  • 2015-01-01 @ddalle: First version

PlotL1(n=None, nFirst=None, nLast=None, **kw)

Plot the L1 residual

Call:
>>> h = hist.PlotL1(n=None, nFirst=None, nLast=None, **kw)
Inputs:
hist: cape.cfdx.dataBook.CaseResid

Instance of the DataBook residual history

n: int

Only show the last n iterations

nFirst: int

Plot starting at iteration nFirst

nLast: int

Plot up to iteration nLast

FigWidth: float

Figure width

FigHeight: float

Figure height

Outputs:
h: dict

Dictionary of figure/plot handles

Versions:
  • 2014-11-12 @ddalle: Version 1.0

  • 2014-12-09 @ddalle: Moved to AeroPlot

  • 2015-02-15 @ddalle: Transferred to dataBook.Aero

  • 2015-03-04 @ddalle: Added nStart and nLast

  • 2015-10-21 @ddalle: Referred to PlotResid()

PlotL2(n=None, nFirst=None, nLast=None, **kw)

Plot the L2 residual

Call:
>>> h = hist.PlotL2(n=None, nFirst=None, nLast=None, **kw)
Inputs:
hist: cape.cfdx.dataBook.CaseResid

Instance of the DataBook residual history

n: int

Only show the last n iterations

nFirst: int

Plot starting at iteration nFirst

nLast: int

Plot up to iteration nLast

FigWidth: float

Figure width

FigHeight: float

Figure height

Outputs:
h: dict

Dictionary of figure/plot handles

Versions:
  • 2014-11-12 @ddalle: Version 1.0

  • 2014-12-09 @ddalle: Moved to AeroPlot

  • 2015-02-15 @ddalle: Transferred to dataBook.Aero

  • 2015-03-04 @ddalle: Added nStart and nLast

  • 2015-10-21 @ddalle: Referred to PlotResid()

PlotLInf(n=None, nFirst=None, nLast=None, **kw)

Plot the L-infinity residual

Call:
>>> h = hist.PlotLInf(n=None, nFirst=None, nLast=None, **kw)
Inputs:
hist: cape.cfdx.dataBook.CaseResid

Instance of the DataBook residual history

n: int

Only show the last n iterations

nFirst: int

Plot starting at iteration nFirst

nLast: int

Plot up to iteration nLast

FigWidth: float

Figure width

FigHeight: float

Figure height

Outputs:
h: dict

Dictionary of figure/plot handles

Versions:
  • 2016-02-04 @ddalle: Copied from PlotL2()

PlotResid(c='L1Resid', n=None, nFirst=None, nLast=None, **kw)

Plot a residual by name

Call:
>>> h = hist.PlotResid(c='L1Resid', n=None, **kw)
Inputs:
hist: cape.cfdx.dataBook.CaseResid

Instance of the DataBook residual history

c: str

Name of coefficient to plot

n: int

Only show the last n iterations

LineOptions: dict

Plot options for the primary line(s)

nFirst: int

Plot starting at iteration nFirst

nLast: int

Plot up to iteration nLast

FigWidth: float

Figure width

FigHeight: float

Figure height

YLabel: str

Label for y-axis

Outputs:
h: dict

Dictionary of figure/plot handles

Versions:
  • 2014-11-12 @ddalle: Version 1.0

  • 2014-12-09 @ddalle: Moved to AeroPlot

  • 2015-02-15 @ddalle: Transferred to dataBook.Aero

  • 2015-03-04 @ddalle: Added nStart and nLast

  • 2015-10-21 @ddalle: Copied from PlotL1()

  • 2022-01-28 @ddalle: Added xcol
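
A usage sketch for the residual plotters, assuming a residual history hist has already been read and matplotlib is installed:

# Plot the last 1000 iterations of the L1 residual
h = hist.PlotL1(n=1000, FigWidth=6.0, FigHeight=4.5)

# Equivalent generic form, selecting the residual column by name
h = hist.PlotResid(c="L1Resid", n=1000, YLabel="L1 residual")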

Other cape.cfdx.dataBook methods

cape.cfdx.dataBook.ImportPyPlot()

Import matplotlib.pyplot if not already loaded

Call:
>>> ImportPyPlot()
Versions:
  • 2014-12-27 @ddalle: Version 1.0

cape.cfdx.dataBook.get_xlim(ha, pad=0.05)

Calculate appropriate x-limits to include all lines in a plot

Plotted objects of class matplotlib.lines.Line2D are checked.

Call:
>>> xmin, xmax = get_xlim(ha, pad=0.05)
Inputs:
ha: matplotlib.axes.AxesSubplot

Axis handle

pad: float

Extra padding applied to the minimum and maximum values

Outputs:
xmin: float

Minimum x coordinate including padding

xmax: float

Maximum x coordinate including padding

Versions:
  • 2015-07-06 @ddalle: Version 1.0

cape.cfdx.dataBook.get_ylim(ha, pad=0.05)

Calculate appropriate y-limits to include all lines in a plot

Plotted objects of the classes matplotlib.lines.Line2D and matplotlib.collections.PolyCollection are checked.

Call:
>>> ymin, ymax = get_ylim(ha, pad=0.05)
Inputs:
ha: matplotlib.axes.AxesSubplot

Axis handle

pad: float

Extra padding applied to the minimum and maximum values

Outputs:
ymin: float

Minimum y coordinate including padding

ymax: float

Maximum y coordinate including padding

Versions:
  • 2015-07-06 @ddalle: Version 1.0
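
These two helpers are typically used together to rescale an existing axis handle. A short sketch, assuming matplotlib is installed:

import matplotlib.pyplot as plt
from cape.cfdx.dataBook import get_xlim, get_ylim

# Make a simple plot, then recompute padded limits from its contents
fig, ha = plt.subplots()
ha.plot([0, 1, 2, 3], [0.0, 0.4, 0.3, 0.9])

xmin, xmax = get_xlim(ha, pad=0.05)
ymin, ymax = get_ylim(ha, pad=0.05)
ha.set_xlim(xmin, xmax)
ha.set_ylim(ymin, ymax)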