
Python Black formatting method

I have a forked project that is being formatted with the Black code formatter (22.3.0), the same version as the one defined in the original project (JinaAI). However, the formatting comes out different, which throws off diffs.

Any suggestions on any or all of these issues are greatly appreciated.

My pyproject.toml settings:

[tool.black]
line-length = 88
target-version = ['py38']
include = '\.pyi?$'
preview = false
skip-string-normalization = true

The only setting applied in their project is the -S flag (skip-string-normalization), i.e. black -S; I did not find any other configs.

From .pre-commit-config.yaml

- repo: https://github.com/ambv/black
  rev: 22.3.0
  hooks:
  - id: black
    types: [python]
    exclude: ^(jina/proto/pb/jina_pb2.py|jina/proto/pb/jina_pb2_grpc.py|jina/proto/pb2/jina_pb2.py|jina/proto/pb2/jina_pb2_grpc.py|docs/|jina/resources/)
    args:
      - -S
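
For what it's worth, ambv/black is the old location of the repository; the same hook can point at the current psf/black repo. A sketch of the equivalent hook (unchanged apart from the URL, and with the long exclude pattern elided):

```yaml
- repo: https://github.com/psf/black
  rev: 22.3.0
  hooks:
  - id: black
    types: [python]
    args:
      - -S
```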

I have preserved the formatting here as much as possible so that it does not change the output.

Issue # 1

Continuation lines inside the condition are indented 8 spaces instead of 4.

mine:

        if (
            item == 'load_config' and inspect.ismethod(obj) and obj.__self__ is Flow
        ):  # check if obj load config call from an instance and not the Class
            warnings.warn(
                "Calling `load_config` from a Flow instance will override all of the instance's initial parameters. We recommend to use `Flow.load_config(...)` instead"
            )

theirs:


        if (
                item == 'load_config' and inspect.ismethod(obj) and obj.__self__ is Flow
        ):  # check if obj load config call from an instance and not the Class
            warnings.warn(
                "Calling `load_config` from a Flow instance will override all of the instance's initial parameters. We recommend to use `Flow.load_config(...)` instead"
            )
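
The practical cost is diff churn: every line whose continuation indent differs shows up as a removed/added pair. A minimal stdlib sketch with hypothetical three-line snippets (not the project's real code):

```python
# Show how a continuation-indent change turns into +/- diff noise,
# using difflib from the standard library.
import difflib

mine = [
    "if (",
    "    item == 'load_config'",
    "):",
]
theirs = [
    "if (",
    "        item == 'load_config'",
    "):",
]

diff = list(difflib.unified_diff(mine, theirs, lineterm=""))
for line in diff:
    print(line)
```

Only the reindented middle line appears as a change; the structurally identical lines stay as context.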

Issue # 2

Continuation indentation is 8 spaces instead of 4 in method signatures.

mine:

    def to_kubernetes_yaml(
        self,
        output_base_path: str,
        k8s_namespace: Optional[str] = None,
        include_gateway: bool = True,
    ):

theirs:

    def to_kubernetes_yaml(
            self,
            output_base_path: str,
            k8s_namespace: Optional[str] = None,
            include_gateway: bool = True,
    ):

Issue # 3

Not all parameters are broken onto separate lines; in their version, the bare * stays on the same line as self.

Mine also puts the closing ) on a separate line; theirs does not.

mine:

    def __init__(
        self,
        *,
        env: Optional[dict] = None,
        inspect: Optional[str] = 'COLLECT',
        log_config: Optional[str] = None,
        name: Optional[str] = None,
        quiet: Optional[bool] = False,
        quiet_error: Optional[bool] = False,
        reload: Optional[bool] = False,
        uses: Optional[str] = None,
        workspace: Optional[str] = None,
        **kwargs,
    ):

theirs:

    def __init__(
        self,*,
        env: Optional[dict] = None, 
        inspect: Optional[str] = 'COLLECT', 
        log_config: Optional[str] = None, 
        name: Optional[str] = None, 
        quiet: Optional[bool] = False, 
        quiet_error: Optional[bool] = False, 
        reload: Optional[bool] = False, 
        uses: Optional[str] = None, 
        workspace: Optional[str] = None, 
        **kwargs):

Issue # 4

The continuation lines of the implicitly concatenated string are aligned differently.

mine:

    set_deployment_parser(
        sp.add_parser(
            'deployment',
            description='Start a Deployment. '
            'You should rarely use this directly unless you '
            'are doing low-level orchestration',
            formatter_class=_chf,
            **(dict(help='Start a Deployment')) if _SHOW_ALL_ARGS else {},
        )
    )

theirs:

    set_deployment_parser(
        sp.add_parser(
            'deployment',
            description='Start a Deployment. '
                        'You should rarely use this directly unless you '
                        'are doing low-level orchestration',
            formatter_class=_chf,
            **(dict(help='Start a Deployment')) if _SHOW_ALL_ARGS else {},
        )
    )


source https://stackoverflow.com/questions/74837142/python-black-formatting-method
