• 1 Post
  • 18 Comments
Joined 9 months ago
Cake day: March 1st, 2024

  • To make this interesting, you could have asked for a stipulation that I fund the other side of the bet. Oddly, you didn’t insist on that. Once the bet is funded, drop the “make it interesting” claim: that’s what a bet is.

    If I were in your shoes and believed strongly in your predictions, I’d do due diligence.

    1. Read through the person’s GitHub account. Go through all the code and, mercilessly but fairly, do a public code review of all public packages, documentation, and commit style. What code quality does this person produce? What’s the likelihood this person could be prepared to collaborate with others?

    This is normally enough to evaluate someone. I’ve ripped apart people who presented themselves as Python coders and were actually rank amateurs.

    Here is my GitHub account. I submit to your code review. Meaning: during the code review, I have to defend my actions against any concerns you bring up. Whatever public humiliation you have in store for me, I cannot complain or retaliate.

    Notice there are no code of conduct files in any of the packages. Free your inner troll and be merciless!

    https://github.com/msftcangoblowm

    While there, if you like a package, star it

    2. Understand the problem

    What would it take to create a solution to this issue?

    Does the person, with that GitHub history, clearly understand the issue? Enough to come up with a viable solution?

    If you had doubts, you would admit it and say you’re not confident enough in the prediction; any bet could possibly go very wrong. You could admit to having serious doubts without shame.

    If you had confidence in the prediction, having conducted due diligence, you would call the bluff and take the guy’s money.

    You took the third option: get called out and be shown a non-risk-taker, someone who doesn’t bother doing their own research but doesn’t mind throwing shade at everyone and everything.


  • Strategy --> deflection

    Involve other people, not yourself; reframe the discussion.

    So your predictions are worthless because you are unwilling to take on any risk.

    Coding involves risk, and favors those willing to take it on. A gambling man you are not!

    You can throw shade and FUD around all day, every day, without consequence or care, because you neither offer nor put any skin in the game.

    Just empty words, like a secretary giving a language-skill assessment.

    So if I said I see ghosts and dragons and can shit rainbows out of my butt, you’d be too weak to call the bluff.


  • more likely

    That almost sounds like you might consider jumping on Polymarket, initiating the prediction, and putting money down on it to create a position, but need a little nudge.

    Are you willing to make that bet? The size of the bet reflects how strongly you feel. Are you going to make this interesting?

    The other side of that bet would be:

    I could become that guy who extends the theory, makes a better way of doing it, and creates and publishes the package and docs.

    And the world+dog recognizes the package amongst the other tools in this genre, rather than me conforming to existing tools (uv, poetry, or pip-compile-multi).

    In your favor, there are three tools. So three people/teams on this planet have presented a solution. You can count that on one hand with fingers to spare!

    On the other hand, let’s keep in mind this is a Python-specific forum, and everyone here is a skilled, super-talented coder and probably a full-on freak’n genius (lifts hand, pinky to closest edge of mouth; everyone looks around at one another and copies, then looks back at you with an eerie, almost coordinated, synchronized eyebrow raise). And I oddly posted about this exact topic. Literally anyone and everyone who has commented could be that guy.

    scratches head

    looks up with one eye to check star positions

    rubs chin

    occasional alternating strong eyebrow movements …

    (with hand on chin) Who is this guy? Should I call his bluff by taking a position? What’s the likelihood he’s secretly a closet poetry user and just some poser?

    If you won, could you be sad?

    If you lost, would you not get upset or ego-hurt, and instead be much happier with the published tool than with the money?

    What are the odds looking like on this particular prediction?

    Looking forward to you posting the URL to the prediction on Polymarket, then promoting the market to maximize your returns. First in and clean house. Rinse, wash, and repeat with this blowhard wannabe (referring to myself).






  • UNIX philosophy: one tool that does one thing well.

    Best to have a damn good reason when breaking this principle (e.g. vendoring), or be funded by Money McBags.

    requirements files are requirements files, not venvs. They may be installed into a venv, but they are not venvs themselves. The only things a venv provides that are of interest to your requirements files are the relative folder path (e.g. ‘.venv’) and the Python interpreter path. Nothing more. When using tox, the Python version is hardcoded, so you only need to provide the relative folder path.

    The venv management tools we have are sufficient. The problem is not the venv; it’s managing the requirements files.

    Your one tool sucks just as much as my five tools when it comes to managing requirements files. None of them do the job.
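
    The split this argues for, many small requirements files instead of one blob, can be sketched with pip-tools style .in files. A hedged sketch; file and package names are illustrative, not from any real project:

    ```
    # requirements/prod.in -- runtime deps, loose human-edited bounds
    requests>=2.31

    # requirements/dev.in -- dev tooling, layered on top of prod
    -r prod.in
    pytest

    # each .in compiles to exact pins, one output per input, e.g.:
    #   pip-compile requirements/prod.in -o requirements/prod.txt
    ```

    Each file stays small enough to review on its own, which is the one-thing-well principle applied to the requirements files themselves.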


  • Within the context of resolving dependency conflicts, poetry decided pyproject.toml is a great place to put requirements.

    This is what people know.

    pyproject.toml or venv management should otherwise never come into the conversation.

    My personal opinion is: venv, pip, pyenv, pip-tools, and tox are sufficient to manage venvs.

    venvs are not required to manage requirements files. They’re a convenience so dev tools are accessible.

    Currently the options are: poetry or uv.

    With honorable mention to pip-compile-multi, which locks dependencies.

    poetry and uv manage venvs… Why?
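
    To make the “sufficient” claim concrete, here is a minimal sketch of that stack using only stdlib venv and the pip bundled with it (pyenv, pip-tools, and tox layer on top); paths assume POSIX:

    ```shell
    # a venv is just a relative folder path plus an interpreter; nothing more is needed
    python3 -m venv .venv                 # create it (stdlib, no extra tool)
    .venv/bin/python -m pip --version     # pip is already inside (.venv\Scripts\python on Windows)
    # from here: .venv/bin/python -m pip install -r requirements.txt
    ```

    No manager tool in sight, which is the point being made about poetry and uv.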


  • That’s a loaded question, which I’d like to avoid answering at the moment. It would lead to a package release announcement, which this post is not, and which I’m not prepared to write right now.

    Instead, here is an admittedly unsatisfactory response, for which I apologize.

    I wish to have the option to take it back later and give the straight, exact answer your question deserves.

    My use case is your use case and everyone else’s use case.

    Avoiding dependency hell while keeping things easily manageable. Breaking up complexity into the smallest pieces possible. And having a CLI tool to fix what’s fixable while reporting on what’s not.

    My preference is to do this beforehand.




  • My position is that it’s not messy enough.

    Let’s start off by admitting what the goal is.

    We all want to avoid dependency hell.

    Our primary interest is not merely cleaning up the mess of requirements files.

    Cleaning up the mess results in some unintended consequences:

    1. noise
    2. complexity
    3. confusion

    noise

    All the requirements information is in one place. Sounds great, until you want to tackle and document very specific issues.

    Like when Sphinx dropped support for Python 3.9, and myst-parser restricted the Sphinx upper-bound version, fixed it in a commit, but did not create a release.

    Or cffi, where every single commit just blows our minds, adding support for things we all want. So we want to set a cffi lower-bound version.

    My point being: these are all specific issues and should be dealt with separately. And when one is no longer relevant, you know exactly what to remove. Zero noise.
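
    Such per-issue pins could live in one tiny, self-documenting file. A sketch, with illustrative (not actual) version bounds:

    ```
    # pins.in -- one pin per documented issue
    # myst-parser capped Sphinx in a commit but never cut a release; mirror the cap
    sphinx<7.2
    # cffi keeps adding support we want; lower bound only
    cffi>=1.16
    ```

    pip can apply a file like this at install time via its constraints option, e.g. `pip install -r requirements.txt -c pins.in`. When the upstream release lands, delete the pin; nothing else changes.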

    complexity

    When things go horribly wrong, the wrapper gets in the way, so now you have to deal with both the wrapper and the underlying issue. So there is a learning curve, an API interface, and increased required know-how.

    The simple answer here is, do not do that.

    confusion

    When a dependency hell issue arises, we have to deal with it, and find ourselves drawn to the poetry or uv documentation. The issue has nothing to do with either, but we look to them to see how others solve it, the poetry or uv way.

    The only know-how that should be needed is what’s in the pip docs.

    What’s your suggestion?

    I would prefer to deal with dependency hell before it happens. To do this, the requirements files are broken up so they are easier to deal with.

    Centralizing everything into pyproject.toml does the opposite.

    Rather than dealing with the issue beforehand, you get to deal with it, good and hard, afterwards.





  • To keep it simple

    testing and static type checking – catch the bugs before your users do

    linting and formatters – so git diff isn’t pure noise showing trailing and unnecessary whitespace, and collaborators won’t have to go back and correct things that could have been fixed automagically.

    in-code documentation – can be extracted by Sphinx as part of the documentation build. Hint: interrogate is your friend.

    gh workflows – to have the test suite run against various Python versions, OSes, and maybe architectures. Without that, you can’t even be confident it runs well on your own machine, let alone anywhere else.
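
    That matrix can be sketched as a minimal GitHub Actions fragment (action versions and the tox entry point are illustrative choices, not prescriptions):

    ```yaml
    # .github/workflows/test.yml -- run the suite across Python versions and OSes
    name: tests
    on: [push, pull_request]
    jobs:
      test:
        strategy:
          matrix:
            os: [ubuntu-latest, macos-latest, windows-latest]
            python-version: ["3.9", "3.10", "3.11", "3.12"]
        runs-on: ${{ matrix.os }}
        steps:
          - uses: actions/checkout@v4
          - uses: actions/setup-python@v5
            with:
              python-version: ${{ matrix.python-version }}
          - run: python -m pip install tox && tox
    ```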

    requirements.txt – is an output file. Where is requirements.in?

    xz hacker sends his love
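
    For the requirements.txt point, the flow it implies, in pip-tools terms (package name illustrative):

    ```
    # requirements.in -- the human-edited input; requirements.txt is compiled from it
    requests>=2.31

    # pip-compile requirements.in                    -> requirements.txt (exact pins)
    # pip-compile --generate-hashes requirements.in  -> pins plus hashes (the xz-era defense)
    ```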

    Makefile – for people who like a ton of shell scripts in their Python packages. Up until you realize that you know which Python interpreter is being run, but can’t have any level of confidence about the shell interpreter, because it’s a big unknown and unknowable. You’ve just got to take it on faith.