mirror of
https://github.com/natelandau/obsidian-metadata.git
synced 2025-11-17 01:13:39 -05:00
feat: export metadata (#14)

* docs(readme): fix line breaks
* feat: export metadata to a CSV
* fix: finalize colors for questions
* feat: inspect frontmatter, inline, and tags separately
* feat: export metadata to JSON
* fix: do not count in-page links as tags
* ci(codecov): adjust patch target percentage down
* feat(metadata): export CSV or JSON from command line
@@ -61,6 +61,7 @@
     "foxundermoon.shell-format",
     "GitHub.copilot",
     "Gruntfuggly.todo-tree",
+    "GrapeCity.gc-excelviewer",
    "mhutchie.git-graph",
    "njpwerner.autodocstring",
    "oderwat.indent-rainbow",
@@ -61,10 +61,10 @@ repos:
           entry: yamllint --strict --config-file .yamllint.yml

     - repo: "https://github.com/charliermarsh/ruff-pre-commit"
-      rev: "v0.0.237"
+      rev: "v0.0.239"
       hooks:
           - id: ruff
-            args: ["--extend-ignore", "I001,D301,D401,PLR2004"]
+            args: ["--extend-ignore", "I001,D301,D401,PLR2004,PLR0913"]

    - repo: "https://github.com/jendrikseipp/vulture"
      rev: "v2.7"
README.md (60 lines changed)
@@ -1,15 +1,18 @@
 [](https://badge.fury.io/py/obsidian-metadata)  [](https://github.com/natelandau/obsidian-metadata/actions/workflows/python-code-checker.yml) [](https://codecov.io/gh/natelandau/obsidian-metadata)
 
 # obsidian-metadata
 
 A script to make batch updates to metadata in an Obsidian vault. No changes are
 made to the Vault until they are explicitly committed.
 
 [](https://asciinema.org/a/555789)
 
 ## Important Disclaimer
 
 **It is strongly recommended that you back up your vault prior to committing changes.** This script makes changes directly to the markdown files in your vault. Once the changes are committed, there is no ability to recreate the original information unless you have a backup. Follow the instructions in the script to create a backup of your vault if needed. The author of this script is not responsible for any data loss that may occur. Use at your own risk.
 
 ## Install
 
 Requires Python v3.10 or above.
 
 ```bash
@@ -17,54 +20,74 @@ pip install obsidian-metadata
 ```
 
 ## Usage
-Run `obsidian-metadata` from the command line to invoke the script. Add `--help` to view additional options.
 
-Obsidian-metadata provides a menu of sub-commands.
+### CLI Commands
 
+- `--config-file`: Specify a custom configuration file location
+- `--dry-run`: Make no destructive changes
+- `--export-csv`: Specify a path and create a CSV export of all metadata
+- `--export-json`: Specify a path and create a JSON export of all metadata
+- `--help`: Shows interactive help and exits
+- `--log-file`: Specify a log file location
+- `--log-to-file`: Will log to a file
+- `--vault-path`: Specify a path to an Obsidian Vault
+- `--verbose`: Set verbosity level (0=WARN, 1=INFO, 2=DEBUG, 3=TRACE)
+- `--version`: Prints the version number and exits
+
+### Running the script
+
+Once installed, run `obsidian-metadata` in your terminal to enter an interactive menu of sub-commands.
 
 **Vault Actions**
-Create or delete a backup of your vault.
 - Backup: Create a backup of the vault.
 - Delete Backup: Delete a backup of the vault.
 
 **Inspect Metadata**
-Inspect the metadata in your vault.
-- View all metadata in the vault
-
-**Filter Notes in Scope**:
-Limit the scope of notes to be processed with one or more filters.
+- View all metadata in the vault
+- View all frontmatter
+- View all inline metadata
+- View all inline tags
+- Export all metadata to CSV or JSON file
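The README does not pin down the layout of the new CSV and JSON exports. As a rough sketch of what an export of vault metadata could look like, using only the standard library (the `area`/`key`/`value` column layout and all key names below are illustrative assumptions, not the tool's actual schema):

```python
import csv
import json

# Hypothetical metadata snapshot; the real tool derives this from the vault.
metadata = {
    "frontmatter": {"author": ["Jane Doe"], "tags": ["draft"]},
    "inline_metadata": {"status": ["in-progress"]},
    "tags": ["books/fiction"],
}

# JSON export: dump the whole structure as one object.
with open("metadata_export.json", "w") as fp:
    json.dump(metadata, fp, indent=2)

# CSV export: flatten to one row per key/value pair.
with open("metadata_export.csv", "w", newline="") as fp:
    writer = csv.writer(fp)
    writer.writerow(["area", "key", "value"])
    for area in ("frontmatter", "inline_metadata"):
        for key, values in metadata[area].items():
            for value in values:
                writer.writerow([area, key, value])
    for tag in metadata["tags"]:
        writer.writerow(["tags", "tag", tag])
```

The JSON form round-trips losslessly, while the flattened CSV is easier to open in a spreadsheet.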
 
+**Filter Notes in Scope**: Limit the scope of notes to be processed with one or more filters.
 
 - Path filter (regex): Limit scope based on the path or filename
 - Metadata Filter: Limit scope based on a key or key/value pair
 - Tag Filter: Limit scope based on an in-text tag
 - List and Clear Filters: List all current filters and clear one or all
 - List notes in scope: List notes that will be processed.
 
-**Add Metadata**
-Add new metadata to your vault.
+**Add Metadata**: Add new metadata to your vault.
 - Add metadata to the frontmatter
 - Add to inline metadata (Not yet implemented)
 - Add to inline tag (Not yet implemented)
 
-**Rename Metadata**
-Rename either a key and all associated values, a specific value within a key, or an in-text tag.
+**Rename Metadata**: Rename either a key and all associated values, a specific value within a key, or an in-text tag.
 - Rename a key
 - Rename a value
 - Rename an inline tag
 
-**Delete Metadata**
-Delete either a key and all associated values, or a specific value.
+**Delete Metadata**: Delete either a key and all associated values, or a specific value.
 - Delete a key and associated values
 - Delete a value from a key
 - Delete an inline tag
 
-**Review Changes**
-Prior to committing changes, review all changes that will be made.
+**Review Changes**: Prior to committing changes, review all changes that will be made.
 - View a diff of the changes that will be made
 
-**Commit Changes**
-Write the changes to disk. This step is not undoable.
+**Commit Changes**: Write the changes to disk. This step is not undoable.
 - Commit changes to the vault
 
 ### Configuration
 
 `obsidian-metadata` requires a configuration file at `~/.obsidian_metadata.toml`. On first run, this file will be created. You can specify a new location for the configuration file with the `--config-file` option.
 
 To add additional vaults, copy the default section and add the appropriate information. The script will prompt you to select a vault if multiple exist in the configuration file.
@@ -87,7 +110,6 @@ Below is an example with two vaults.
 
 To bypass the configuration file and specify a vault to use at runtime use the `--vault-path` option.
 
 
 # Contributing
 
 ## Setup: Once per project
@@ -4,8 +4,11 @@ coverage:
     project:
         default:
            target: 50% # the required coverage value
-            threshold: 1% # the leniency in hitting the target
+            threshold: 5% # the leniency in hitting the target
+    patch:
+        default:
+            target: 50%
+            threshold: 5%
    ignore:
        - tests/
poetry.lock (1740 lines changed, generated file): diff suppressed because it is too large.
@@ -20,6 +20,7 @@
 loguru = "^0.6.0"
 python = "^3.10"
 questionary = "^1.10.0"
+regex = "^2022.10.31"
 rich = "^13.2.0"
 ruamel-yaml = "^0.17.21"
 shellingham = "^1.4.0"

@@ -206,7 +207,7 @@
 help = "Lint this package"
 
 [[tool.poe.tasks.lint.sequence]]
-shell = "ruff --extend-ignore=I001,D301 src/ tests/"
+shell = "ruff --extend-ignore=I001,D301,D401,PLR2004,PLR0913 src/ tests/"
 
 [[tool.poe.tasks.lint.sequence]]
 shell = "black --check src/ tests/"
@@ -8,6 +8,7 @@ from obsidian_metadata._utils.utilities import (
     dict_contains,
     dict_values_to_lists_strings,
     docstring_parameter,
+    merge_dictionaries,
     remove_markdown_sections,
     version_callback,
 )

@@ -20,6 +21,7 @@ __all__ = [
     "dict_values_to_lists_strings",
     "docstring_parameter",
     "LoggerManager",
+    "merge_dictionaries",
    "remove_markdown_sections",
    "vault_validation",
    "version_callback",
@@ -8,101 +8,6 @@ import typer
 from obsidian_metadata.__version__ import __version__
 
 
-def dict_values_to_lists_strings(dictionary: dict, strip_null_values: bool = False) -> dict:
-    """Converts all values in a dictionary to lists of strings.
-
-    Args:
-        dictionary (dict): Dictionary to convert
-        strip_null (bool): Whether to strip null values
-
-    Returns:
-        dict: Dictionary with all values converted to lists of strings
-
-        {key: sorted(new_dict[key]) for key in sorted(new_dict)}
-    """
-    new_dict = {}
-
-    if strip_null_values:
-        for key, value in dictionary.items():
-            if isinstance(value, list):
-                new_dict[key] = sorted([str(item) for item in value if item is not None])
-            elif isinstance(value, dict):
-                new_dict[key] = dict_values_to_lists_strings(value)  # type: ignore[assignment]
-            elif value is None or value == "None" or value == "":
-                new_dict[key] = []
-            else:
-                new_dict[key] = [str(value)]
-
-        return new_dict
-
-    for key, value in dictionary.items():
-        if isinstance(value, list):
-            new_dict[key] = sorted([str(item) for item in value])
-        elif isinstance(value, dict):
-            new_dict[key] = dict_values_to_lists_strings(value)  # type: ignore[assignment]
-        else:
-            new_dict[key] = [str(value)]
-
-    return new_dict
-
-
-def remove_markdown_sections(
-    text: str,
-    strip_codeblocks: bool = False,
-    strip_inlinecode: bool = False,
-    strip_frontmatter: bool = False,
-) -> str:
-    """Strip markdown sections from text.
-
-    Args:
-        text (str): Text to remove code blocks from
-        strip_codeblocks (bool, optional): Strip code blocks. Defaults to False.
-        strip_inlinecode (bool, optional): Strip inline code. Defaults to False.
-        strip_frontmatter (bool, optional): Strip frontmatter. Defaults to False.
-
-    Returns:
-        str: Text without code blocks
-    """
-    if strip_codeblocks:
-        text = re.sub(r"`{3}.*?`{3}", "", text, flags=re.DOTALL)
-
-    if strip_inlinecode:
-        text = re.sub(r"`.*?`", "", text)
-
-    if strip_frontmatter:
-        text = re.sub(r"^\s*---.*?---", "", text, flags=re.DOTALL)
-
-    return text  # noqa: RET504
-
-
-def version_callback(value: bool) -> None:
-    """Print version and exit."""
-    if value:
-        print(f"{__package__.split('.')[0]}: v{__version__}")
-        raise typer.Exit()
-
-
-def docstring_parameter(*sub: Any) -> Any:
-    """Decorator to replace variables within docstrings.
-
-    Args:
-        sub (Any): Replacement variables
-
-    Usage:
-        @docstring_parameter("foo", "bar")
-        def foo():
-            '''This is a {0} docstring with {1} variables.'''
-
-    """
-
-    def dec(obj: Any) -> Any:
-        """Format object."""
-        obj.__doc__ = obj.__doc__.format(*sub)
-        return obj
-
-    return dec
-
-
 def clean_dictionary(dictionary: dict[str, Any]) -> dict[str, Any]:
     """Clean up a dictionary by markdown formatting from keys and values.
@@ -155,3 +60,126 @@ def dict_contains(
     return any(found_keys)
 
     return key in dictionary and value in dictionary[key]
+
+
+def dict_values_to_lists_strings(dictionary: dict, strip_null_values: bool = False) -> dict:
+    """Converts all values in a dictionary to lists of strings.
+
+    Args:
+        dictionary (dict): Dictionary to convert
+        strip_null (bool): Whether to strip null values
+
+    Returns:
+        dict: Dictionary with all values converted to lists of strings
+
+        {key: sorted(new_dict[key]) for key in sorted(new_dict)}
+    """
+    new_dict = {}
+
+    if strip_null_values:
+        for key, value in dictionary.items():
+            if isinstance(value, list):
+                new_dict[key] = sorted([str(item) for item in value if item is not None])
+            elif isinstance(value, dict):
+                new_dict[key] = dict_values_to_lists_strings(value)  # type: ignore[assignment]
+            elif value is None or value == "None" or value == "":
+                new_dict[key] = []
+            else:
+                new_dict[key] = [str(value)]
+
+        return new_dict
+
+    for key, value in dictionary.items():
+        if isinstance(value, list):
+            new_dict[key] = sorted([str(item) for item in value])
+        elif isinstance(value, dict):
+            new_dict[key] = dict_values_to_lists_strings(value)  # type: ignore[assignment]
+        else:
+            new_dict[key] = [str(value)]
+
+    return new_dict
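The re-added helper is a pure function and can be exercised in isolation. A runnable copy (note that with `strip_null_values` enabled, `None`, `"None"`, and empty-string values become empty lists, and `None` items are dropped from lists):

```python
def dict_values_to_lists_strings(dictionary: dict, strip_null_values: bool = False) -> dict:
    """Convert all values in a dictionary to sorted lists of strings."""
    new_dict: dict = {}

    if strip_null_values:
        for key, value in dictionary.items():
            if isinstance(value, list):
                # Drop None items, stringify the rest, and sort.
                new_dict[key] = sorted([str(item) for item in value if item is not None])
            elif isinstance(value, dict):
                new_dict[key] = dict_values_to_lists_strings(value)
            elif value is None or value == "None" or value == "":
                new_dict[key] = []
            else:
                new_dict[key] = [str(value)]
        return new_dict

    for key, value in dictionary.items():
        if isinstance(value, list):
            new_dict[key] = sorted([str(item) for item in value])
        elif isinstance(value, dict):
            new_dict[key] = dict_values_to_lists_strings(value)
        else:
            new_dict[key] = [str(value)]
    return new_dict
```

One subtlety visible above: the recursive call for nested dicts does not forward `strip_null_values`, so nested nulls are stringified rather than stripped.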
+
+
+def docstring_parameter(*sub: Any) -> Any:
+    """Decorator to replace variables within docstrings.
+
+    Args:
+        sub (Any): Replacement variables
+
+    Usage:
+        @docstring_parameter("foo", "bar")
+        def foo():
+            '''This is a {0} docstring with {1} variables.'''
+
+    """
+
+    def dec(obj: Any) -> Any:
+        """Format object."""
+        obj.__doc__ = obj.__doc__.format(*sub)
+        return obj
+
+    return dec
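The decorator substitutes values into a function's docstring at definition time via `str.format`, which can be verified directly:

```python
from typing import Any


def docstring_parameter(*sub: Any) -> Any:
    """Decorator to replace variables within docstrings."""

    def dec(obj: Any) -> Any:
        # str.format fills the {0}, {1}, ... placeholders in the docstring.
        obj.__doc__ = obj.__doc__.format(*sub)
        return obj

    return dec


@docstring_parameter("foo", "bar")
def example() -> None:
    """This is a {0} docstring with {1} variables."""
```

This is how the project injects the package name into the Typer help text.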
+
+
+def merge_dictionaries(dict1: dict, dict2: dict) -> dict:
+    """Merge two dictionaries.
+
+    Args:
+        dict1 (dict): First dictionary.
+        dict2 (dict): Second dictionary.
+
+    Returns:
+        dict: Merged dictionary.
+    """
+    for k, v in dict2.items():
+        if k in dict1:
+            if isinstance(v, list):
+                dict1[k].extend(v)
+        else:
+            dict1[k] = v
+
+    for k, v in dict1.items():
+        if isinstance(v, list):
+            dict1[k] = sorted(set(v))
+        elif isinstance(v, dict):  # pragma: no cover
+            for kk, vv in v.items():
+                if isinstance(vv, list):
+                    v[kk] = sorted(set(vv))
+
+    return dict(sorted(dict1.items()))
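The new `merge_dictionaries` helper keeps both inputs' keys, extends list values, then deduplicates with `sorted(set(...))`. The flattened diff view loses indentation, so the exact nesting of the `else` branch is inferred here; a runnable sketch under that assumption:

```python
def merge_dictionaries(dict1: dict, dict2: dict) -> dict:
    """Merge dict2 into dict1, extending list values and sorting the result."""
    for k, v in dict2.items():
        if k in dict1:
            if isinstance(v, list):
                dict1[k].extend(v)
        else:
            # Key only exists in dict2: adopt it as-is.
            dict1[k] = v

    for k, v in dict1.items():
        if isinstance(v, list):
            dict1[k] = sorted(set(v))  # deduplicate and sort merged lists
        elif isinstance(v, dict):
            for kk, vv in v.items():
                if isinstance(vv, list):
                    v[kk] = sorted(set(vv))

    return dict(sorted(dict1.items()))
```

Note the merge mutates `dict1` in place, which is presumably why the callers in this commit pass `.copy()` of both arguments.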
+
+
+def remove_markdown_sections(
+    text: str,
+    strip_codeblocks: bool = False,
+    strip_inlinecode: bool = False,
+    strip_frontmatter: bool = False,
+) -> str:
+    """Strip markdown sections from text.
+
+    Args:
+        text (str): Text to remove code blocks from
+        strip_codeblocks (bool, optional): Strip code blocks. Defaults to False.
+        strip_inlinecode (bool, optional): Strip inline code. Defaults to False.
+        strip_frontmatter (bool, optional): Strip frontmatter. Defaults to False.
+
+    Returns:
+        str: Text without code blocks
+    """
+    if strip_codeblocks:
+        text = re.sub(r"`{3}.*?`{3}", "", text, flags=re.DOTALL)
+
+    if strip_inlinecode:
+        text = re.sub(r"`.*?`", "", text)
+
+    if strip_frontmatter:
+        text = re.sub(r"^\s*---.*?---", "", text, flags=re.DOTALL)
+
+    return text  # noqa: RET504
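The three regexes can be sanity-checked on a small document. Order matters: fenced code blocks are removed before inline backticks, so a triple-backtick fence is never half-consumed by the single-backtick pattern:

```python
import re


def remove_markdown_sections(
    text: str,
    strip_codeblocks: bool = False,
    strip_inlinecode: bool = False,
    strip_frontmatter: bool = False,
) -> str:
    """Strip selected markdown sections from text."""
    if strip_codeblocks:
        # Non-greedy match between triple-backtick fences, across newlines.
        text = re.sub(r"`{3}.*?`{3}", "", text, flags=re.DOTALL)
    if strip_inlinecode:
        text = re.sub(r"`.*?`", "", text)
    if strip_frontmatter:
        # YAML frontmatter delimited by --- at the start of the document.
        text = re.sub(r"^\s*---.*?---", "", text, flags=re.DOTALL)
    return text
```

The vault uses this to avoid counting tags and keys that only appear inside code or frontmatter it is not inspecting.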
+
+
+def version_callback(value: bool) -> None:
+    """Print version and exit."""
+    if value:
+        print(f"{__package__.split('.')[0]}: v{__version__}")
+        raise typer.Exit()
@@ -28,16 +28,30 @@ HELP_TEXT = """
 @app.command()
 @docstring_parameter(__package__)
 def main(
-    vault_path: Path = typer.Option(
-        None,
-        help="Path to Obsidian vault",
-        show_default=False,
-    ),
     config_file: Path = typer.Option(
         Path(Path.home() / f".{__package__}.toml"),
         help="Specify a custom path to a configuration file",
         show_default=False,
     ),
+    export_csv: Path = typer.Option(
+        None,
+        help="Exports all metadata to a specified CSV file and exits. (Will overwrite any existing file)",
+        show_default=False,
+        dir_okay=False,
+        file_okay=True,
+    ),
+    export_json: Path = typer.Option(
+        None,
+        help="Exports all metadata to a specified JSON file and exits. (Will overwrite any existing file)",
+        show_default=False,
+        dir_okay=False,
+        file_okay=True,
+    ),
+    vault_path: Path = typer.Option(
+        None,
+        help="Path to Obsidian vault",
+        show_default=False,
+    ),
     dry_run: bool = typer.Option(
         False,
         "--dry-run",
@@ -89,6 +103,10 @@ def main(
     [bold underline]Inspect Metadata[/]
     Inspect the metadata in your vault.
         • View all metadata in the vault
+        • View all frontmatter
+        • View all inline metadata
+        • View all inline tags
+        • Export all metadata to CSV or JSON file
 
     [bold underline]Filter Notes in Scope[/]
     Limit the scope of notes to be processed with one or more filters.
@@ -165,6 +183,15 @@ def main(
     vault_to_use = next(vault for vault in config.vaults if vault.name == vault_name)
     application = Application(dry_run=dry_run, config=vault_to_use)
 
+    if export_json is not None:
+        path = Path(export_json).expanduser().resolve()
+        application.noninteractive_export_json(path)
+        raise typer.Exit(code=0)
+
+    if export_csv is not None:
+        path = Path(export_json).expanduser().resolve()
+        application.noninteractive_export_csv(path)
+        raise typer.Exit(code=0)
+
     application.application_main()
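The export branches normalize the user-supplied path with `expanduser().resolve()` before handing it to the application. (As an aside, the `export_csv` branch above resolves `export_json`, which looks like a copy-paste slip in this commit.) The normalization itself is straightforward; the path below is illustrative:

```python
from pathlib import Path

# expanduser() replaces a leading "~" with the user's home directory;
# resolve() makes the path absolute and collapses "." / ".." components,
# so downstream code can write to it regardless of the current directory.
raw = "exports/metadata.json"
path = Path(raw).expanduser().resolve()
```

Passing `~/exports/metadata.json` instead would expand the tilde the same way before resolving.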
@@ -2,7 +2,7 @@
 
 from typing import Any
+from pathlib import Path
 import questionary
 from rich import print
 from rich import box

@@ -55,11 +55,7 @@ class Application:
             case "review_changes":
                 self.review_changes()
             case "commit_changes":
-                if self.commit_changes():
-                    break
-                log.error("Commit failed. Please run with -vvv for more info.")
-                break
+                self.commit_changes()
 
             case _:
                 break
@@ -221,13 +217,50 @@ class Application:
 
         choices = [
             {"name": "View all metadata", "value": "all_metadata"},
+            {"name": "View all frontmatter", "value": "all_frontmatter"},
+            {"name": "View all inline_metadata", "value": "all_inline"},
+            {"name": "View all keys", "value": "all_keys"},
+            {"name": "View all inline tags", "value": "all_tags"},
+            questionary.Separator(),
+            {"name": "Write all metadata to CSV", "value": "export_csv"},
+            {"name": "Write all metadata to JSON file", "value": "export_json"},
             questionary.Separator(),
             {"name": "Back", "value": "back"},
         ]
         while True:
             match self.questions.ask_selection(choices=choices, question="Select a vault action"):
                 case "all_metadata":
-                    self.vault.metadata.print_metadata()
+                    print("")
+                    self.vault.metadata.print_metadata(area=MetadataType.ALL)
+                    print("")
+                case "all_frontmatter":
+                    print("")
+                    self.vault.metadata.print_metadata(area=MetadataType.FRONTMATTER)
+                    print("")
+                case "all_inline":
+                    print("")
+                    self.vault.metadata.print_metadata(area=MetadataType.INLINE)
+                    print("")
+                case "all_keys":
+                    print("")
+                    self.vault.metadata.print_metadata(area=MetadataType.KEYS)
+                    print("")
+                case "all_tags":
+                    print("")
+                    self.vault.metadata.print_metadata(area=MetadataType.TAGS)
+                    print("")
+                case "export_csv":
+                    path = self.questions.ask_path(question="Enter a path for the CSV file")
+                    if path is None:
+                        return
+                    self.vault.export_metadata(path=path, format="csv")
+                    alerts.success(f"Metadata written to {path}")
+                case "export_json":
+                    path = self.questions.ask_path(question="Enter a path for the JSON file")
+                    if path is None:
+                        return
+                    self.vault.export_metadata(path=path, format="json")
+                    alerts.success(f"Metadata written to {path}")
                 case _:
                     return
@@ -316,12 +349,13 @@ class Application:
             self.vault.backup()
 
         if questionary.confirm(f"Commit {len(changed_notes)} changed files to disk?").ask():
-            self.vault.write()
+            self.vault.commit_changes()
+            if not self.dry_run:
                 alerts.success(f"{len(changed_notes)} changes committed to disk. Exiting")
             return True
 
-        return False
+        return True
 
     def delete_inline_tag(self) -> None:
         """Delete an inline tag."""
@@ -389,6 +423,18 @@ class Application:
         )
         self.questions = Questions(vault=self.vault)
 
+    def noninteractive_export_csv(self, path: Path) -> None:
+        """Export the vault metadata to CSV."""
+        self._load_vault()
+        self.vault.export_metadata(format="json", path=str(path))
+        alerts.success(f"Exported metadata to {path}")
+
+    def noninteractive_export_json(self, path: Path) -> None:
+        """Export the vault metadata to JSON."""
+        self._load_vault()
+        self.vault.export_metadata(format="json", path=str(path))
+        alerts.success(f"Exported metadata to {path}")
+
     def rename_key(self) -> None:
         """Renames a key in the vault."""
@@ -9,3 +9,5 @@ class MetadataType(Enum):
     FRONTMATTER = "Frontmatter"
     INLINE = "Inline Metadata"
     TAGS = "Inline Tags"
+    KEYS = "Metadata Keys Only"
+    ALL = "All Metadata"
@@ -13,12 +13,14 @@ from obsidian_metadata._utils import (
     clean_dictionary,
     dict_contains,
     dict_values_to_lists_strings,
+    merge_dictionaries,
     remove_markdown_sections,
 )
 from obsidian_metadata.models import Patterns  # isort: ignore
+from obsidian_metadata.models.enums import MetadataType
 
 PATTERNS = Patterns()
-INLINE_TAG_KEY: str = "Inline Tags"
+INLINE_TAG_KEY: str = "inline_tag"
@@ -26,50 +28,83 @@ class VaultMetadata:
 
     def __init__(self) -> None:
         self.dict: dict[str, list[str]] = {}
+        self.frontmatter: dict[str, list[str]] = {}
+        self.inline_metadata: dict[str, list[str]] = {}
+        self.tags: list[str] = []
 
     def __repr__(self) -> str:
         """Representation of all metadata."""
         return str(self.dict)
 
-    def index_metadata(self, metadata: dict[str, list[str]]) -> None:
+    def index_metadata(
+        self, area: MetadataType, metadata: dict[str, list[str]] | list[str]
+    ) -> None:
         """Index pre-existing metadata in the vault. Takes a dictionary as input and merges it with the existing metadata. Does not overwrite existing keys.
 
         Args:
+            area (MetadataType): Type of metadata.
             metadata (dict): Metadata to add.
         """
-        existing_metadata = self.dict
-
-        new_metadata = clean_dictionary(metadata)
-
-        for k, v in new_metadata.items():
-            if k in existing_metadata:
-                if isinstance(v, list):
-                    existing_metadata[k].extend(v)
-            else:
-                existing_metadata[k] = v
-
-        for k, v in existing_metadata.items():
-            if isinstance(v, list):
-                existing_metadata[k] = sorted(set(v))
-            elif isinstance(v, dict):
-                for kk, vv in v.items():
-                    if isinstance(vv, list):
-                        v[kk] = sorted(set(vv))
-
-        self.dict = dict(sorted(existing_metadata.items()))
+        if isinstance(metadata, dict):
+            new_metadata = clean_dictionary(metadata)
+            self.dict = merge_dictionaries(self.dict.copy(), new_metadata.copy())
+
+            if area == MetadataType.FRONTMATTER:
+                self.frontmatter = merge_dictionaries(self.frontmatter.copy(), new_metadata.copy())
+
+            if area == MetadataType.INLINE:
+                self.inline_metadata = merge_dictionaries(
+                    self.inline_metadata.copy(), new_metadata.copy()
+                )
+
+        if area == MetadataType.TAGS and isinstance(metadata, list):
+            self.tags.extend(metadata)
+            self.tags = sorted({s.strip("#") for s in self.tags})
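The tag bookkeeping above normalizes and deduplicates in one expression: a set comprehension collapses duplicates while `str.strip("#")` removes the leading hash, so `#books` and `books` index as the same tag:

```python
# Mirrors the tag normalization in index_metadata: "#" is stripped from
# the ends of each tag and duplicates collapse via the set before sorting.
tags = ["#books", "books", "#python", "#books/fiction"]
tags = sorted({s.strip("#") for s in tags})
```

Because `strip` removes characters from both ends, a tag like `#books#` would also lose its trailing hash; interior hashes are untouched.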
 
-    def contains(self, key: str, value: str = None, is_regex: bool = False) -> bool:
+    def contains(
+        self, area: MetadataType, key: str = None, value: str = None, is_regex: bool = False
+    ) -> bool:
         """Check if a key and/or a value exists in the metadata.
 
         Args:
-            key (str): Key to check.
+            area (MetadataType): Type of metadata to check.
+            key (str, optional): Key to check.
             value (str, optional): Value to check.
             is_regex (bool, optional): Use regex to check. Defaults to False.
 
         Returns:
             bool: True if the key exists.
+
+        Raises:
+            ValueError: Key must be provided when checking for a key's existence.
+            ValueError: Value must be provided when checking for a tag's existence.
         """
+        if area != MetadataType.TAGS and key is None:
+            raise ValueError("Key must be provided when checking for a key's existence.")
+
+        match area:  # noqa: E999
+            case MetadataType.ALL:
+                if dict_contains(self.dict, key, value, is_regex):
+                    return True
+                if key is None and value is not None:
+                    if is_regex:
+                        return any(re.search(value, tag) for tag in self.tags)
+                    return value in self.tags
+            case MetadataType.FRONTMATTER:
+                return dict_contains(self.frontmatter, key, value, is_regex)
+            case MetadataType.INLINE:
+                return dict_contains(self.inline_metadata, key, value, is_regex)
+            case MetadataType.KEYS:
+                return dict_contains(self.dict, key, value, is_regex)
+            case MetadataType.TAGS:
+                if value is None:
+                    raise ValueError("Value must be provided when checking for a tag's existence.")
+                if is_regex:
+                    return any(re.search(value, tag) for tag in self.tags)
|
return value in self.tags
|
||||||
|
|
||||||
|
return False
|
||||||
|
|
||||||
def delete(self, key: str, value_to_delete: str = None) -> bool:
|
def delete(self, key: str, value_to_delete: str = None) -> bool:
|
||||||
"""Delete a key or a key's value from the metadata. Regex is supported to allow deleting more than one key or value.
|
"""Delete a key or a key's value from the metadata. Regex is supported to allow deleting more than one key or value.
|
||||||
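Stepping outside the diff for a moment: the reworked `index_metadata` merges incoming metadata into the vault-wide dictionary without overwriting existing keys, combining, de-duplicating, and sorting the values of shared keys. A minimal standalone sketch of that merge rule (the helper name `merge_metadata` is hypothetical; the real code delegates to `clean_dictionary` and `merge_dictionaries`):

```python
def merge_metadata(
    existing: dict[str, list[str]], new: dict[str, list[str]]
) -> dict[str, list[str]]:
    """Merge new metadata into existing metadata without overwriting keys.

    Values for shared keys are combined, de-duplicated, and sorted.
    """
    merged = {k: list(v) for k, v in existing.items()}
    for key, values in new.items():
        merged.setdefault(key, [])
        merged[key].extend(values)
    # Sort keys and values for a stable, predictable index
    return {k: sorted(set(v)) for k, v in sorted(merged.items())}


vault_meta = {"tags": ["daily"], "author": ["nate"]}
incoming = {"tags": ["project", "daily"], "status": ["draft"]}
print(merge_metadata(vault_meta, incoming))
```

Note that the shared `tags` key keeps its existing value and gains the new one, rather than being replaced.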
@@ -99,35 +134,53 @@ class VaultMetadata:
 
         return False
 
-    def print_keys(self) -> None:
-        """Print all metadata keys."""
-        columns = Columns(
-            sorted(self.dict.keys()),
-            equal=True,
-            expand=True,
-            title="All metadata keys in Obsidian vault",
-        )
-        print(columns)
-
-    def print_metadata(self) -> None:
-        """Print all metadata."""
-        table = Table(show_footer=False, show_lines=True)
-        table.add_column("Keys")
-        table.add_column("Values")
-        for key, value in sorted(self.dict.items()):
-            values: str | dict[str, list[str]] = (
-                "\n".join(sorted(value)) if isinstance(value, list) else value
-            )
-            table.add_row(f"[bold]{key}[/]", str(values))
-        Console().print(table)
-
-    def print_tags(self) -> None:
-        """Print all tags."""
-        columns = Columns(
-            sorted(self.dict["tags"]),
-            equal=True,
-            expand=True,
-            title="All tags in Obsidian vault",
-        )
-        print(columns)
+    def print_metadata(self, area: MetadataType) -> None:
+        """Print metadata to the terminal.
+
+        Args:
+            area (MetadataType): Type of metadata to print
+        """
+        dict_to_print: dict[str, list[str]] = None
+        list_to_print: list[str] = None
+        match area:
+            case MetadataType.INLINE:
+                dict_to_print = self.inline_metadata.copy()
+                header = "All inline metadata"
+            case MetadataType.FRONTMATTER:
+                dict_to_print = self.frontmatter.copy()
+                header = "All frontmatter"
+            case MetadataType.TAGS:
+                list_to_print = []
+                for tag in self.tags:
+                    list_to_print.append(f"#{tag}")
+                header = "All inline tags"
+            case MetadataType.KEYS:
+                list_to_print = sorted(self.dict.keys())
+                header = "All Keys"
+            case MetadataType.ALL:
+                dict_to_print = self.dict.copy()
+                list_to_print = []
+                for tag in self.tags:
+                    list_to_print.append(f"#{tag}")
+                header = "All metadata"
+
+        if dict_to_print is not None:
+            table = Table(title=header, show_footer=False, show_lines=True)
+            table.add_column("Keys")
+            table.add_column("Values")
+            for key, value in sorted(dict_to_print.items()):
+                values: str | dict[str, list[str]] = (
+                    "\n".join(sorted(value)) if isinstance(value, list) else value
+                )
+                table.add_row(f"[bold]{key}[/]", str(values))
+            Console().print(table)
+
+        if list_to_print is not None:
+            columns = Columns(
+                sorted(list_to_print),
+                equal=True,
+                expand=True,
+                title=header if area != MetadataType.ALL else "All inline tags",
+            )
+            print(columns)
@@ -392,6 +392,9 @@ class Note:
             typer.Exit: If the note's path is not found.
         """
         p = self.note_path if path is None else path
+        if self.dry_run:
+            log.trace(f"DRY RUN: Writing note {p} to disk")
+            return
 
         try:
             with open(p, "w") as f:
@@ -1,8 +1,9 @@
 """Regexes for parsing frontmatter and note content."""
 
-import re
 from dataclasses import dataclass
-from typing import Pattern
+
+import regex as re
+from regex import Pattern
 
 
 @dataclass
@@ -11,7 +12,8 @@ class Patterns:
 
     find_inline_tags: Pattern[str] = re.compile(
         r"""
-        (?:^|[ \|_,;:\*\(\)\[\]\\\.]) # Before tag is start of line or separator
+        (?:^|[ \|_,;:\*\)\[\]\\\.]|(?<!\])\() # Before tag is start of line or separator
+        (?<!\/\/[\w\d_\.\(\)\/&_-]+) # Before tag is not a link
         \#([^ \|,;:\*\(\)\[\]\\\.\n#&]+) # Match tag until separator or end of line
         """,
         re.MULTILINE | re.X,
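The import swap here, stdlib `re` out and the third-party `regex` package in, is what makes the new "tag is not a link" guard legal: `(?<!\/\/[\w\d_\.\(\)\/&_-]+)` is a variable-width lookbehind, and the standard library only supports fixed-width lookbehind. A quick demonstration with a simplified pattern:

```python
import re

# Stdlib `re` rejects variable-width lookbehind, which the
# "before tag is not a link" guard requires. The `regex` package accepts it.
try:
    re.compile(r"(?<!//[\w./-]+)#(\w+)")
    supported = True
except re.error:
    supported = False

print(f"stdlib re supports variable-width lookbehind: {supported}")
```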
@@ -18,6 +18,15 @@ from obsidian_metadata.models.vault import Vault
 
 PATTERNS = Patterns()
 
+# Reset the default style of the questionary prompts qmark
+questionary.prompts.checkbox.DEFAULT_STYLE = questionary.Style([("qmark", "")])
+questionary.prompts.common.DEFAULT_STYLE = questionary.Style([("qmark", "")])
+questionary.prompts.confirm.DEFAULT_STYLE = questionary.Style([("qmark", "")])
+questionary.prompts.confirm.DEFAULT_STYLE = questionary.Style([("qmark", "")])
+questionary.prompts.path.DEFAULT_STYLE = questionary.Style([("qmark", "")])
+questionary.prompts.select.DEFAULT_STYLE = questionary.Style([("qmark", "")])
+questionary.prompts.text.DEFAULT_STYLE = questionary.Style([("qmark", "")])
+
 
 class Questions:
     """Class for asking questions to the user and validating responses with questionary."""
@@ -64,13 +73,13 @@ class Questions:
         """
         self.style = questionary.Style(
             [
-                ("qmark", "fg:#729fcf bold"),
-                ("question", "fg:#729fcf bold"),
+                ("qmark", "bold"),
+                ("question", "bold"),
                 ("separator", "fg:#808080"),
                 ("instruction", "fg:#808080"),
-                ("highlighted", "fg:#729fcf bold underline"),
+                ("highlighted", "bold underline"),
                 ("text", ""),
-                ("pointer", "fg:#729fcf bold"),
+                ("pointer", "bold"),
             ]
         )
         self.vault = vault
@@ -85,7 +94,7 @@ class Questions:
         if len(text) < 1:
             return "Tag cannot be empty"
 
-        if not self.vault.contains_inline_tag(text):
+        if not self.vault.metadata.contains(area=MetadataType.TAGS, value=text):
             return f"'{text}' does not exist as a tag in the vault"
 
         return True
@@ -99,7 +108,7 @@ class Questions:
         if len(text) < 1:
             return "Key cannot be empty"
 
-        if not self.vault.metadata.contains(text):
+        if not self.vault.metadata.contains(area=MetadataType.KEYS, key=text):
             return f"'{text}' does not exist as a key in the vault"
 
         return True
@@ -118,7 +127,7 @@ class Questions:
         except re.error as error:
             return f"Invalid regex: {error}"
 
-        if not self.vault.metadata.contains(text, is_regex=True):
+        if not self.vault.metadata.contains(area=MetadataType.KEYS, key=text, is_regex=True):
             return f"'{text}' does not exist as a key in the vault"
 
         return True
@@ -169,7 +178,9 @@ class Questions:
         if len(text) < 1:
             return "Value cannot be empty"
 
-        if self.key is not None and self.vault.metadata.contains(self.key, text):
+        if self.key is not None and self.vault.metadata.contains(
+            area=MetadataType.ALL, key=self.key, value=text
+        ):
             return f"{self.key}:{text} already exists"
 
         return True
@@ -219,7 +230,9 @@ class Questions:
         if len(text) == 0:
             return True
 
-        if self.key is not None and not self.vault.metadata.contains(self.key, text):
+        if self.key is not None and not self.vault.metadata.contains(
+            area=MetadataType.ALL, key=self.key, value=text
+        ):
             return f"{self.key}:{text} does not exist"
 
         return True
@@ -241,11 +254,42 @@ class Questions:
         except re.error as error:
             return f"Invalid regex: {error}"
 
-        if self.key is not None and not self.vault.metadata.contains(self.key, text, is_regex=True):
+        if self.key is not None and not self.vault.metadata.contains(
+            area=MetadataType.ALL, key=self.key, value=text, is_regex=True
+        ):
             return f"No values in {self.key} match regex: {text}"
 
         return True
 
+    def ask_application_main(self) -> str:  # pragma: no cover
+        """Selectable list for the main application interface.
+
+        Args:
+            style (questionary.Style): The style to use for the question.
+
+        Returns:
+            str: The selected application.
+        """
+        return questionary.select(
+            "What do you want to do?",
+            choices=[
+                {"name": "Vault Actions", "value": "vault_actions"},
+                {"name": "Inspect Metadata", "value": "inspect_metadata"},
+                {"name": "Filter Notes in Scope", "value": "filter_notes"},
+                {"name": "Add Metadata", "value": "add_metadata"},
+                {"name": "Rename Metadata", "value": "rename_metadata"},
+                {"name": "Delete Metadata", "value": "delete_metadata"},
+                questionary.Separator("-------------------------------"),
+                {"name": "Review Changes", "value": "review_changes"},
+                {"name": "Commit Changes", "value": "commit_changes"},
+                questionary.Separator("-------------------------------"),
+                {"name": "Quit", "value": "abort"},
+            ],
+            use_shortcuts=False,
+            style=self.style,
+            qmark="INPUT |",
+        ).ask()
+
     def ask_area(self) -> MetadataType | str:  # pragma: no cover
         """Ask the user for the metadata area to work on.
 
@@ -361,35 +405,6 @@ class Questions:
             qmark="INPUT |",
         ).ask()
 
-    def ask_application_main(self) -> str:  # pragma: no cover
-        """Selectable list for the main application interface.
-
-        Args:
-            style (questionary.Style): The style to use for the question.
-
-        Returns:
-            str: The selected application.
-        """
-        return questionary.select(
-            "What do you want to do?",
-            choices=[
-                {"name": "Vault Actions", "value": "vault_actions"},
-                {"name": "Inspect Metadata", "value": "inspect_metadata"},
-                {"name": "Filter Notes in Scope", "value": "filter_notes"},
-                {"name": "Add Metadata", "value": "add_metadata"},
-                {"name": "Rename Metadata", "value": "rename_metadata"},
-                {"name": "Delete Metadata", "value": "delete_metadata"},
-                questionary.Separator("-------------------------------"),
-                {"name": "Review Changes", "value": "review_changes"},
-                {"name": "Commit Changes", "value": "commit_changes"},
-                questionary.Separator("-------------------------------"),
-                {"name": "Quit", "value": "abort"},
-            ],
-            use_shortcuts=False,
-            style=self.style,
-            qmark="INPUT |",
-        ).ask()
-
     def ask_new_key(self, question: str = "New key name") -> str:  # pragma: no cover
         """Ask the user for a new metadata key.
 
@@ -422,7 +437,7 @@ class Questions:
             question, validate=self._validate_new_value, style=self.style, qmark="INPUT |"
         ).ask()
 
-    def ask_number(self, question: str = "Enter a number") -> int:
+    def ask_number(self, question: str = "Enter a number") -> int:  # pragma: no cover
         """Ask the user for a number.
 
         Args:
@@ -435,6 +450,17 @@ class Questions:
             question, validate=self._validate_number, style=self.style, qmark="INPUT |"
         ).ask()
 
+    def ask_path(self, question: str = "Enter a path") -> str:  # pragma: no cover
+        """Ask the user for a path.
+
+        Args:
+            question (str, optional): The question to ask. Defaults to "Enter a path".
+
+        Returns:
+            str: A path.
+        """
+        return questionary.path(question, style=self.style, qmark="INPUT |").ask()
+
     def ask_selection(
         self, choices: list[Any], question: str = "Select an option"
     ) -> Any:  # pragma: no cover
@@ -1,10 +1,11 @@
 """Obsidian vault representation."""
 
+import csv
 import re
 import shutil
 from dataclasses import dataclass
 from pathlib import Path
 
+import json
 import rich.repr
 from rich import box
 from rich.console import Console
@@ -46,6 +47,7 @@ class Vault:
         filters: list[VaultFilter] = [],
     ):
         self.vault_path: Path = config.path
+        self.name = self.vault_path.name
         self.dry_run: bool = dry_run
         self.backup_path: Path = self.vault_path.parent / f"{self.vault_path.name}.bak"
         self.exclude_paths: list[Path] = []
@@ -132,10 +134,15 @@ class Vault:
         ) as progress:
             progress.add_task(description="Processing notes...", total=None)
             for _note in self.notes_in_scope:
-                self.metadata.index_metadata(_note.frontmatter.dict)
-                self.metadata.index_metadata(_note.inline_metadata.dict)
-                self.metadata.index_metadata(
-                    {_note.inline_tags.metadata_key: _note.inline_tags.list}
-                )
+                self.metadata.index_metadata(
+                    area=MetadataType.FRONTMATTER, metadata=_note.frontmatter.dict
+                )
+                self.metadata.index_metadata(
+                    area=MetadataType.INLINE, metadata=_note.inline_metadata.dict
+                )
+                self.metadata.index_metadata(
+                    area=MetadataType.TAGS,
+                    metadata=_note.inline_tags.list,
+                )
 
     def add_metadata(self, area: MetadataType, key: str, value: str | list[str] = None) -> int:
@@ -183,33 +190,21 @@ class Vault:
 
         alerts.success(f"Vault backed up to: {self.backup_path}")
 
-    def contains_inline_tag(self, tag: str, is_regex: bool = False) -> bool:
-        """Check if vault contains the given inline tag.
-
-        Args:
-            tag (str): Tag to check for.
-            is_regex (bool, optional): Whether to use regex to match tag.
-
-        Returns:
-            bool: True if tag is found in vault.
-        """
-        return any(_note.contains_inline_tag(tag) for _note in self.notes_in_scope)
-
-    def contains_metadata(self, key: str, value: str = None, is_regex: bool = False) -> bool:
-        """Check if vault contains the given metadata.
-
-        Args:
-            key (str): Key to check for. If value is None, will check vault for key.
-            value (str, optional): Value to check for.
-            is_regex (bool, optional): Whether to use regex to match key/value.
-
-        Returns:
-            bool: True if tag is found in vault.
-        """
-        if value is None:
-            return self.metadata.contains(key, is_regex=is_regex)
-
-        return self.metadata.contains(key, value, is_regex=is_regex)
+    def commit_changes(self) -> None:
+        """Commit changes by writing to disk."""
+        log.debug("Writing changes to vault...")
+        if self.dry_run:
+            for _note in self.notes_in_scope:
+                if _note.has_changes():
+                    alerts.dryrun(
+                        f"writing changes to {_note.note_path.relative_to(self.vault_path)}"
+                    )
+            return
+
+        for _note in self.notes_in_scope:
+            if _note.has_changes():
+                log.trace(f"writing to {_note.note_path}")
+                _note.write()
 
     def delete_backup(self) -> None:
         """Delete the vault backup."""
@@ -348,10 +343,44 @@ class Vault:
 
         return num_changed
 
-    def write(self) -> None:
-        """Write changes to the vault."""
-        log.debug("Writing changes to vault...")
-        if self.dry_run is False:
-            for _note in self.notes_in_scope:
-                log.trace(f"writing to {_note.note_path}")
-                _note.write()
+    def export_metadata(self, path: str, format: str = "csv") -> None:
+        """Write metadata to a csv file.
+
+        Args:
+            path (Path): Path to write csv file to.
+            export_as (str, optional): Export as 'csv' or 'json'. Defaults to "csv".
+        """
+        export_file = Path(path).expanduser().resolve()
+
+        match format:  # noqa: E999
+            case "csv":
+                with open(export_file, "w", encoding="UTF8") as f:
+                    writer = csv.writer(f)
+                    writer.writerow(["Metadata Type", "Key", "Value"])
+
+                    for key, value in self.metadata.frontmatter.items():
+                        if isinstance(value, list):
+                            if len(value) > 0:
+                                for v in value:
+                                    writer.writerow(["frontmatter", key, v])
+                            else:
+                                writer.writerow(["frontmatter", key, v])
+
+                    for key, value in self.metadata.inline_metadata.items():
+                        if isinstance(value, list):
+                            if len(value) > 0:
+                                for v in value:
+                                    writer.writerow(["inline_metadata", key, v])
+                            else:
+                                writer.writerow(["frontmatter", key, v])
+                    for tag in self.metadata.tags:
+                        writer.writerow(["tags", "", f"{tag}"])
+            case "json":
+                dict_to_dump = {
+                    "frontmatter": self.metadata.dict,
+                    "inline_metadata": self.metadata.inline_metadata,
+                    "tags": self.metadata.tags,
+                }
+
+                with open(export_file, "w", encoding="UTF8") as f:
+                    json.dump(dict_to_dump, f, indent=4, ensure_ascii=False, sort_keys=True)
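The new `export_metadata` flattens every key/value pair into `Metadata Type, Key, Value` rows for CSV output, or dumps the raw dictionaries as JSON. A self-contained sketch of the same flattening using only the standard library, writing to an in-memory buffer instead of a file (the `export_rows` name is hypothetical):

```python
import csv
import io
import json


def export_rows(frontmatter: dict[str, list[str]], tags: list[str]) -> str:
    """Flatten metadata into 'Metadata Type, Key, Value' CSV rows."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Metadata Type", "Key", "Value"])
    # One row per value so multi-value keys stay filterable in a spreadsheet
    for key, values in frontmatter.items():
        for v in values:
            writer.writerow(["frontmatter", key, v])
    # Tags have no key, so the Key column is left empty
    for tag in tags:
        writer.writerow(["tags", "", tag])
    return buf.getvalue()


csv_text = export_rows({"author": ["nate"]}, ["daily"])
json_text = json.dumps({"frontmatter": {"author": ["nate"]}, "tags": ["daily"]}, sort_keys=True)
print(csv_text)
print(json_text)
```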
|||||||
@@ -2,6 +2,9 @@
|
|||||||
"""Test metadata.py."""
|
"""Test metadata.py."""
|
||||||
from pathlib import Path
|
from pathlib import Path
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
|
||||||
|
from obsidian_metadata.models.enums import MetadataType
|
||||||
from obsidian_metadata.models.metadata import (
|
from obsidian_metadata.models.metadata import (
|
||||||
Frontmatter,
|
Frontmatter,
|
||||||
InlineMetadata,
|
InlineMetadata,
|
||||||
@@ -11,6 +14,7 @@ from obsidian_metadata.models.metadata import (
|
|||||||
from tests.helpers import Regex
|
from tests.helpers import Regex
|
||||||
|
|
||||||
FILE_CONTENT: str = Path("tests/fixtures/test_vault/test1.md").read_text()
|
FILE_CONTENT: str = Path("tests/fixtures/test_vault/test1.md").read_text()
|
||||||
|
TAG_LIST: list[str] = ["tag 1", "tag 2", "tag 3"]
|
||||||
METADATA: dict[str, list[str]] = {
|
METADATA: dict[str, list[str]] = {
|
||||||
"frontmatter_Key1": ["author name"],
|
"frontmatter_Key1": ["author name"],
|
||||||
"frontmatter_Key2": ["note", "article"],
|
"frontmatter_Key2": ["note", "article"],
|
||||||
@@ -22,6 +26,7 @@ METADATA: dict[str, list[str]] = {
|
|||||||
"top_key3": ["top_key3_value"],
|
"top_key3": ["top_key3_value"],
|
||||||
"intext_key": ["intext_key_value"],
|
"intext_key": ["intext_key_value"],
|
||||||
}
|
}
|
||||||
|
METADATA_2: dict[str, list[str]] = {"key1": ["value1"], "key2": ["value2", "value3"]}
|
||||||
FRONTMATTER_CONTENT: str = """
|
FRONTMATTER_CONTENT: str = """
|
||||||
---
|
---
|
||||||
tags:
|
tags:
|
||||||
@@ -64,13 +69,28 @@ repeated_key:: repeated_key_value2
|
|||||||
"""
|
"""
|
||||||
|
|
||||||
|
|
||||||
def test_vault_metadata(capsys) -> None:
|
def test_vault_metadata() -> None:
|
||||||
"""Test VaultMetadata class."""
|
"""Test VaultMetadata class."""
|
||||||
vm = VaultMetadata()
|
vm = VaultMetadata()
|
||||||
assert vm.dict == {}
|
assert vm.dict == {}
|
||||||
|
|
||||||
vm.index_metadata(METADATA)
|
vm.index_metadata(area=MetadataType.FRONTMATTER, metadata=METADATA)
|
||||||
|
vm.index_metadata(area=MetadataType.INLINE, metadata=METADATA_2)
|
||||||
|
vm.index_metadata(area=MetadataType.TAGS, metadata=TAG_LIST)
|
||||||
assert vm.dict == {
|
assert vm.dict == {
|
||||||
|
"frontmatter_Key1": ["author name"],
|
||||||
|
"frontmatter_Key2": ["article", "note"],
|
||||||
|
"intext_key": ["intext_key_value"],
|
||||||
|
"key1": ["value1"],
|
||||||
|
"key2": ["value2", "value3"],
|
||||||
|
"shared_key1": ["shared_key1_value"],
|
||||||
|
"shared_key2": ["shared_key2_value"],
|
||||||
|
"tags": ["tag 1", "tag 2", "tag 3"],
|
||||||
|
"top_key1": ["top_key1_value"],
|
||||||
|
"top_key2": ["top_key2_value"],
|
||||||
|
"top_key3": ["top_key3_value"],
|
||||||
|
}
|
||||||
|
assert vm.frontmatter == {
|
||||||
"frontmatter_Key1": ["author name"],
|
"frontmatter_Key1": ["author name"],
|
||||||
"frontmatter_Key2": ["article", "note"],
|
"frontmatter_Key2": ["article", "note"],
|
||||||
"intext_key": ["intext_key_value"],
|
"intext_key": ["intext_key_value"],
|
||||||
@@ -81,24 +101,28 @@ def test_vault_metadata(capsys) -> None:
|
|||||||
"top_key2": ["top_key2_value"],
|
"top_key2": ["top_key2_value"],
|
||||||
"top_key3": ["top_key3_value"],
|
"top_key3": ["top_key3_value"],
|
||||||
}
|
}
|
||||||
|
assert vm.inline_metadata == {"key1": ["value1"], "key2": ["value2", "value3"]}
|
||||||
vm.print_keys()
|
assert vm.tags == ["tag 1", "tag 2", "tag 3"]
|
||||||
captured = capsys.readouterr()
|
|
||||||
assert captured.out == Regex(r"frontmatter_Key1 +frontmatter_Key2 +intext_key")
|
|
||||||
|
|
||||||
vm.print_tags()
|
|
||||||
captured = capsys.readouterr()
|
|
||||||
assert captured.out == Regex(r"tag 1 +tag 2 +tag 3")
|
|
||||||
|
|
||||||
vm.print_metadata()
|
|
||||||
captured = capsys.readouterr()
|
|
||||||
assert captured.out == Regex(r"┃ Keys +┃ Values +┃")
|
|
||||||
assert captured.out == Regex(r"│ +│ tag 3 +│")
|
|
||||||
assert captured.out == Regex(r"│ frontmatter_Key1 +│ author name +│")
|
|
||||||
|
|
||||||
new_metadata = {"added_key": ["added_value"], "frontmatter_Key2": ["new_value"]}
|
new_metadata = {"added_key": ["added_value"], "frontmatter_Key2": ["new_value"]}
|
||||||
vm.index_metadata(new_metadata)
|
new_tags = ["tag 4", "tag 5"]
|
||||||
|
vm.index_metadata(area=MetadataType.FRONTMATTER, metadata=new_metadata)
|
||||||
|
vm.index_metadata(area=MetadataType.TAGS, metadata=new_tags)
|
||||||
assert vm.dict == {
|
assert vm.dict == {
|
||||||
|
"added_key": ["added_value"],
|
||||||
|
"frontmatter_Key1": ["author name"],
|
||||||
|
"frontmatter_Key2": ["article", "new_value", "note"],
|
||||||
|
"intext_key": ["intext_key_value"],
|
||||||
|
"key1": ["value1"],
|
||||||
|
"key2": ["value2", "value3"],
|
||||||
|
"shared_key1": ["shared_key1_value"],
|
||||||
|
"shared_key2": ["shared_key2_value"],
|
||||||
|
"tags": ["tag 1", "tag 2", "tag 3"],
|
||||||
|
"top_key1": ["top_key1_value"],
|
||||||
|
"top_key2": ["top_key2_value"],
|
||||||
|
"top_key3": ["top_key3_value"],
|
||||||
|
}
|
||||||
|
assert vm.frontmatter == {
|
||||||
"added_key": ["added_value"],
|
"added_key": ["added_value"],
|
||||||
"frontmatter_Key1": ["author name"],
|
"frontmatter_Key1": ["author name"],
|
||||||
"frontmatter_Key2": ["article", "new_value", "note"],
|
"frontmatter_Key2": ["article", "new_value", "note"],
|
||||||
@@ -110,13 +134,73 @@ def test_vault_metadata(capsys) -> None:
|
|||||||
"top_key2": ["top_key2_value"],
|
"top_key2": ["top_key2_value"],
|
||||||
"top_key3": ["top_key3_value"],
|
"top_key3": ["top_key3_value"],
|
||||||
}
|
}
|
||||||
|
assert vm.inline_metadata == {"key1": ["value1"], "key2": ["value2", "value3"]}
|
||||||
|
assert vm.tags == ["tag 1", "tag 2", "tag 3", "tag 4", "tag 5"]
|
||||||
|
|
||||||
|
|
||||||
|
def test_vault_metadata_print(capsys) -> None:
|
||||||
|
"""Test print_metadata method."""
|
||||||
|
vm = VaultMetadata()
|
||||||
|
vm.index_metadata(area=MetadataType.FRONTMATTER, metadata=METADATA)
|
||||||
|
vm.index_metadata(area=MetadataType.INLINE, metadata=METADATA_2)
|
||||||
|
vm.index_metadata(area=MetadataType.TAGS, metadata=TAG_LIST)
|
||||||
|
|
||||||
|
vm.print_metadata(area=MetadataType.ALL)
|
||||||
|
captured = capsys.readouterr()
|
||||||
|
assert "All metadata" in captured.out
|
||||||
|
assert "All inline tags" in captured.out
|
||||||
|
assert "┃ Keys ┃ Values ┃" in captured.out
|
||||||
|
assert "│ shared_key1 │ shared_key1_value │" in captured.out
|
||||||
|
assert captured.out == Regex("#tag 1 +#tag 2")
|
||||||
|
|
||||||
|
vm.print_metadata(area=MetadataType.FRONTMATTER)
|
||||||
|
captured = capsys.readouterr()
|
||||||
|
assert "All frontmatter" in captured.out
|
||||||
|
assert "┃ Keys ┃ Values ┃" in captured.out
|
||||||
|
assert "│ shared_key1 │ shared_key1_value │" in captured.out
|
||||||
|
assert "value1" not in captured.out
|
||||||
|
|
||||||
|
vm.print_metadata(area=MetadataType.INLINE)
|
||||||
|
captured = capsys.readouterr()
|
||||||
|
assert "All inline" in captured.out
|
||||||
|
assert "┃ Keys ┃ Values ┃" in captured.out
|
||||||
|
assert "shared_key1" not in captured.out
|
||||||
|
assert "│ key1 │ value1 │" in captured.out
|
||||||
|
|
||||||
|
vm.print_metadata(area=MetadataType.TAGS)
|
||||||
|
captured = capsys.readouterr()
|
||||||
|
assert "All inline tags " in captured.out
|
||||||
|
assert "┃ Keys ┃ Values ┃" not in captured.out
|
||||||
|
assert captured.out == Regex("#tag 1 +#tag 2")
|
||||||
|
|
||||||
|
vm.print_metadata(area=MetadataType.KEYS)
|
||||||
|
captured = capsys.readouterr()
|
||||||
|
assert "All Keys " in captured.out
|
||||||
|
assert "┃ Keys ┃ Values ┃" not in captured.out
|
||||||
|
assert captured.out != Regex("#tag 1 +#tag 2")
|
||||||
|
assert captured.out == Regex("frontmatter_Key1 +frontmatter_Key2")
|
||||||
|
|
||||||
|
|
||||||
def test_vault_metadata_contains() -> None:
|
def test_vault_metadata_contains() -> None:
|
||||||
"""Test contains method."""
|
"""Test contains method."""
|
||||||
vm = VaultMetadata()
|
vm = VaultMetadata()
|
||||||
vm.index_metadata(METADATA)
|
vm.index_metadata(area=MetadataType.FRONTMATTER, metadata=METADATA)
|
||||||
|
vm.index_metadata(area=MetadataType.INLINE, metadata=METADATA_2)
|
||||||
|
vm.index_metadata(area=MetadataType.TAGS, metadata=TAG_LIST)
|
||||||
assert vm.dict == {
|
assert vm.dict == {
|
||||||
|
"frontmatter_Key1": ["author name"],
|
||||||
|
"frontmatter_Key2": ["article", "note"],
|
||||||
|
"intext_key": ["intext_key_value"],
|
||||||
|
"key1": ["value1"],
|
||||||
|
"key2": ["value2", "value3"],
|
||||||
|
"shared_key1": ["shared_key1_value"],
|
||||||
|
"shared_key2": ["shared_key2_value"],
|
||||||
|
"tags": ["tag 1", "tag 2", "tag 3"],
|
||||||
|
"top_key1": ["top_key1_value"],
|
||||||
|
"top_key2": ["top_key2_value"],
|
||||||
|
"top_key3": ["top_key3_value"],
|
||||||
|
}
|
||||||
|
assert vm.frontmatter == {
|
||||||
"frontmatter_Key1": ["author name"],
|
"frontmatter_Key1": ["author name"],
|
||||||
"frontmatter_Key2": ["article", "note"],
|
"frontmatter_Key2": ["article", "note"],
|
||||||
"intext_key": ["intext_key_value"],
|
"intext_key": ["intext_key_value"],
|
||||||
@@ -127,21 +211,47 @@ def test_vault_metadata_contains() -> None:
         "top_key2": ["top_key2_value"],
         "top_key3": ["top_key3_value"],
     }
+    assert vm.inline_metadata == {"key1": ["value1"], "key2": ["value2", "value3"]}
+    assert vm.tags == ["tag 1", "tag 2", "tag 3"]

-    assert vm.contains("frontmatter_Key1") is True
-    assert vm.contains("frontmatter_Key2", "article") is True
-    assert vm.contains("frontmatter_Key3") is False
-    assert vm.contains("frontmatter_Key2", "no value") is False
-    assert vm.contains("1$", is_regex=True) is True
-    assert vm.contains("5$", is_regex=True) is False
-    assert vm.contains("tags", r"\d", is_regex=True) is True
-    assert vm.contains("tags", r"^\d", is_regex=True) is False
+    with pytest.raises(ValueError):
+        vm.contains(area=MetadataType.ALL, value="key1")
+
+    assert vm.contains(area=MetadataType.ALL, key="no_key") is False
+    assert vm.contains(area=MetadataType.ALL, key="key1") is True
+    assert vm.contains(area=MetadataType.ALL, key="frontmatter_Key2", value="article") is True
+    assert vm.contains(area=MetadataType.ALL, key="frontmatter_Key2", value="none") is False
+    assert vm.contains(area=MetadataType.ALL, key="1$", is_regex=True) is True
+    assert vm.contains(area=MetadataType.ALL, key=r"\d\d", is_regex=True) is False
+
+    assert vm.contains(area=MetadataType.FRONTMATTER, key="no_key") is False
+    assert vm.contains(area=MetadataType.FRONTMATTER, key="frontmatter_Key1") is True
+    assert (
+        vm.contains(area=MetadataType.FRONTMATTER, key="frontmatter_Key2", value="article") is True
+    )
+    assert vm.contains(area=MetadataType.FRONTMATTER, key="frontmatter_Key2", value="none") is False
+    assert vm.contains(area=MetadataType.FRONTMATTER, key="1$", is_regex=True) is True
+    assert vm.contains(area=MetadataType.FRONTMATTER, key=r"\d\d", is_regex=True) is False
+
+    assert vm.contains(area=MetadataType.INLINE, key="no_key") is False
+    assert vm.contains(area=MetadataType.INLINE, key="key1") is True
+    assert vm.contains(area=MetadataType.INLINE, key="key2", value="value3") is True
+    assert vm.contains(area=MetadataType.INLINE, key="key2", value="none") is False
+    assert vm.contains(area=MetadataType.INLINE, key="1$", is_regex=True) is True
+    assert vm.contains(area=MetadataType.INLINE, key=r"\d\d", is_regex=True) is False
+
+    assert vm.contains(area=MetadataType.TAGS, value="no_tag") is False
+    assert vm.contains(area=MetadataType.TAGS, value="tag 1") is True
+    assert vm.contains(area=MetadataType.TAGS, value=r"\w+ \d$", is_regex=True) is True
+    assert vm.contains(area=MetadataType.TAGS, value=r"\w+ \d\d$", is_regex=True) is False
+    with pytest.raises(ValueError):
+        vm.contains(area=MetadataType.TAGS, key="key1")
 def test_vault_metadata_delete() -> None:
     """Test delete method."""
     vm = VaultMetadata()
-    vm.index_metadata(METADATA)
+    vm.index_metadata(area=MetadataType.FRONTMATTER, metadata=METADATA)
     assert vm.dict == {
         "frontmatter_Key1": ["author name"],
         "frontmatter_Key2": ["article", "note"],
@@ -165,7 +275,7 @@ def test_vault_metadata_delete() -> None:
 def test_vault_metadata_rename() -> None:
     """Test rename method."""
     vm = VaultMetadata()
-    vm.index_metadata(METADATA)
+    vm.index_metadata(area=MetadataType.FRONTMATTER, metadata=METADATA)
     assert vm.dict == {
         "frontmatter_Key1": ["author name"],
         "frontmatter_Key2": ["article", "note"],
@@ -5,7 +5,7 @@ import pytest

 from obsidian_metadata.models.patterns import Patterns

-TAG_CONTENT: str = "#1 #2 **#3** [[#4]] [[#5|test]] #6#notag #7_8 #9/10 #11-12 #13; #14, #15. #16: #17* #18(#19) #20[#21] #22\\ #23& #24# #25 **#26** #📅/tag"
+TAG_CONTENT: str = "#1 #2 **#3** [[#4]] [[#5|test]] #6#notag #7_8 #9/10 #11-12 #13; #14, #15. #16: #17* #18(#19) #20[#21] #22\\ #23& #24# #25 **#26** #📅/tag [link](#no_tag) https://example.com/somepage.html_#no_url_tags"
 INLINE_METADATA: str = """
 **1:: 1**
 2_2:: [[2_2]] | 2
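The new `TAG_CONTENT` cases assert that `#` fragments inside markdown links (`[link](#no_tag)`) and URL anchors (`…html_#no_url_tags`) do not count as tags. A minimal sketch of a pattern with that behavior, assuming a simplified tag grammar (the project's real pattern lives in `obsidian_metadata.models.patterns` and is more involved):

```python
import re

# Illustrative pattern: a tag is "#" followed by word chars (plus "/" and "-"),
# but only when the "#" is NOT glued to a preceding word char, "/", or "(" --
# which excludes URL fragments like "page.html_#x" and link anchors "(#x)".
TAG_RE = re.compile(r"(?<![\w/(])#([\w/-]+)")


def find_tags(text: str) -> list[str]:
    """Return the tag names found in text, without the leading '#'."""
    return TAG_RE.findall(text)
```

The negative lookbehind does the work: it rejects a `#` whose immediate left neighbor marks it as part of a URL path or a markdown link target, while still accepting tags at line start, after whitespace, or inside emphasis like `**#3**`.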
@@ -110,7 +110,7 @@ def test_validate_value_exists_regex() -> None:
 def test_validate_new_value() -> None:
     """Test new value validation."""
     questions = Questions(vault=VAULT, key="frontmatter_Key1")
-    assert questions._validate_new_value("new_value") is True
+    assert questions._validate_new_value("not_exists") is True
     assert "Value cannot be empty" in questions._validate_new_value("")
     assert (
         questions._validate_new_value("author name")
@@ -16,6 +16,7 @@ def test_vault_creation(test_vault):
     vault_config = config.vaults[0]
     vault = Vault(config=vault_config)

+    assert vault.name == "vault"
     assert vault.vault_path == vault_path
     assert vault.backup_path == Path(f"{vault_path}.bak")
     assert vault.dry_run is False
@@ -23,16 +24,6 @@ def test_vault_creation(test_vault):
     assert len(vault.all_notes) == 3

     assert vault.metadata.dict == {
-        "Inline Tags": [
-            "ignored_file_tag2",
-            "inline_tag_bottom1",
-            "inline_tag_bottom2",
-            "inline_tag_top1",
-            "inline_tag_top2",
-            "intext_tag1",
-            "intext_tag2",
-            "shared_tag",
-        ],
         "author": ["author name"],
         "bottom_key1": ["bottom_key1_value"],
         "bottom_key2": ["bottom_key2_value"],
@@ -58,6 +49,46 @@ def test_vault_creation(test_vault):
         "type": ["article", "note"],
     }

+    assert vault.metadata.tags == [
+        "ignored_file_tag2",
+        "inline_tag_bottom1",
+        "inline_tag_bottom2",
+        "inline_tag_top1",
+        "inline_tag_top2",
+        "intext_tag1",
+        "intext_tag2",
+        "shared_tag",
+    ]
+    assert vault.metadata.inline_metadata == {
+        "bottom_key1": ["bottom_key1_value"],
+        "bottom_key2": ["bottom_key2_value"],
+        "emoji_📅_key": ["emoji_📅_key_value"],
+        "intext_key": ["intext_value"],
+        "shared_key1": ["shared_key1_value"],
+        "shared_key2": ["shared_key2_value2"],
+        "top_key1": ["top_key1_value"],
+        "top_key2": ["top_key2_value"],
+        "top_key3": ["top_key3_value_as_link"],
+    }
+    assert vault.metadata.frontmatter == {
+        "author": ["author name"],
+        "date_created": ["2022-12-22"],
+        "frontmatter_Key1": ["author name"],
+        "frontmatter_Key2": ["article", "note"],
+        "ignored_frontmatter": ["ignore_me"],
+        "shared_key1": ["shared_key1_value"],
+        "shared_key2": ["shared_key2_value1"],
+        "tags": [
+            "frontmatter_tag1",
+            "frontmatter_tag2",
+            "frontmatter_tag3",
+            "ignored_file_tag1",
+            "shared_tag",
+            "📅/frontmatter_tag3",
+        ],
+        "type": ["article", "note"],
+    }
+
+
 def test_get_filtered_notes(sample_vault) -> None:
     """Test filtering notes."""
@@ -190,18 +221,7 @@ def test_list_editable_notes(test_vault, capsys) -> None:
     vault.list_editable_notes()
     captured = capsys.readouterr()
     assert captured.out == Regex("Notes in current scope")
-    assert captured.out == Regex(r"1 +test1\.md")
-
-
-def test_contains_inline_tag(test_vault) -> None:
-    """Test if the vault contains an inline tag."""
-    vault_path = test_vault
-    config = Config(config_path="tests/fixtures/test_vault_config.toml", vault_path=vault_path)
-    vault_config = config.vaults[0]
-    vault = Vault(config=vault_config)
-
-    assert vault.contains_inline_tag("tag") is False
-    assert vault.contains_inline_tag("intext_tag2") is True
+    assert captured.out == Regex(r"\d +test1\.md")


 def test_add_metadata(test_vault) -> None:
@@ -213,16 +233,6 @@ def test_add_metadata(test_vault) -> None:

     assert vault.add_metadata(MetadataType.FRONTMATTER, "new_key") == 3
     assert vault.metadata.dict == {
-        "Inline Tags": [
-            "ignored_file_tag2",
-            "inline_tag_bottom1",
-            "inline_tag_bottom2",
-            "inline_tag_top1",
-            "inline_tag_top2",
-            "intext_tag1",
-            "intext_tag2",
-            "shared_tag",
-        ],
         "author": ["author name"],
         "bottom_key1": ["bottom_key1_value"],
         "bottom_key2": ["bottom_key2_value"],
@@ -248,18 +258,27 @@ def test_add_metadata(test_vault) -> None:
         "top_key3": ["top_key3_value_as_link"],
         "type": ["article", "note"],
     }
+    assert vault.metadata.frontmatter == {
+        "author": ["author name"],
+        "date_created": ["2022-12-22"],
+        "frontmatter_Key1": ["author name"],
+        "frontmatter_Key2": ["article", "note"],
+        "ignored_frontmatter": ["ignore_me"],
+        "new_key": [],
+        "shared_key1": ["shared_key1_value"],
+        "shared_key2": ["shared_key2_value1"],
+        "tags": [
+            "frontmatter_tag1",
+            "frontmatter_tag2",
+            "frontmatter_tag3",
+            "ignored_file_tag1",
+            "shared_tag",
+            "📅/frontmatter_tag3",
+        ],
+        "type": ["article", "note"],
+    }
     assert vault.add_metadata(MetadataType.FRONTMATTER, "new_key2", "new_key2_value") == 3
     assert vault.metadata.dict == {
-        "Inline Tags": [
-            "ignored_file_tag2",
-            "inline_tag_bottom1",
-            "inline_tag_bottom2",
-            "inline_tag_top1",
-            "inline_tag_top2",
-            "intext_tag1",
-            "intext_tag2",
-            "shared_tag",
-        ],
         "author": ["author name"],
         "bottom_key1": ["bottom_key1_value"],
         "bottom_key2": ["bottom_key2_value"],
@@ -286,19 +305,26 @@ def test_add_metadata(test_vault) -> None:
         "top_key3": ["top_key3_value_as_link"],
         "type": ["article", "note"],
     }
-
-
-def test_contains_metadata(test_vault) -> None:
-    """Test if the vault contains a metadata key."""
-    vault_path = test_vault
-    config = Config(config_path="tests/fixtures/test_vault_config.toml", vault_path=vault_path)
-    vault_config = config.vaults[0]
-    vault = Vault(config=vault_config)
-
-    assert vault.contains_metadata("key") is False
-    assert vault.contains_metadata("top_key1") is True
-    assert vault.contains_metadata("top_key1", "no_value") is False
-    assert vault.contains_metadata("top_key1", "top_key1_value") is True
+    assert vault.metadata.frontmatter == {
+        "author": ["author name"],
+        "date_created": ["2022-12-22"],
+        "frontmatter_Key1": ["author name"],
+        "frontmatter_Key2": ["article", "note"],
+        "ignored_frontmatter": ["ignore_me"],
+        "new_key": [],
+        "new_key2": ["new_key2_value"],
+        "shared_key1": ["shared_key1_value"],
+        "shared_key2": ["shared_key2_value1"],
+        "tags": [
+            "frontmatter_tag1",
+            "frontmatter_tag2",
+            "frontmatter_tag3",
+            "ignored_file_tag1",
+            "shared_tag",
+            "📅/frontmatter_tag3",
+        ],
+        "type": ["article", "note"],
+    }


 def test_delete_inline_tag(test_vault) -> None:
@@ -310,7 +336,7 @@ def test_delete_inline_tag(test_vault) -> None:

     assert vault.delete_inline_tag("no tag") == 0
     assert vault.delete_inline_tag("intext_tag2") == 2
-    assert vault.metadata.dict["Inline Tags"] == [
+    assert vault.metadata.tags == [
         "ignored_file_tag2",
         "inline_tag_bottom1",
         "inline_tag_bottom2",
@@ -347,7 +373,7 @@ def test_rename_inline_tag(test_vault) -> None:

     assert vault.rename_inline_tag("no tag", "new_tag") == 0
     assert vault.rename_inline_tag("intext_tag2", "new_tag") == 2
-    assert vault.metadata.dict["Inline Tags"] == [
+    assert vault.metadata.tags == [
         "ignored_file_tag2",
         "inline_tag_bottom1",
         "inline_tag_bottom2",