Technetium 2 days ago

They proclaim "privacy-respecting" but all your keystrokes go to OpenAI. Horrific and genuinely upsetting.

Edit: The author replied to another comment that there is an intent to add local AI. If that is the plan, then fix the wording until it can actually be considered privacy-respecting: https://news.ycombinator.com/item?id=41579144

Alex4386 2 days ago

People really should stop calling a glorified OpenAI API wrapper open-source software.

  • jillesvangurp 2 days ago

    There are several free alternatives to OpenAI that use the same API, which would make it possible to swap OpenAI out for one of those models in this extension. At least on paper. There is an open issue on the GitHub repository requesting something like that.

    So, it's not as clear cut. The general approach of using LLMs for this is not a bad one; LLMs are pretty good at this stuff.
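
    For what it's worth, a minimal sketch of what that swap could look like from the client side (assuming a local server such as ollama exposing its OpenAI-compatible endpoint on the default port; the model name is just whatever you have pulled locally):

        from openai import OpenAI

        # Same client code, different backend: only base_url and the key change.
        # Assumes ollama is serving its OpenAI-compatible API on localhost:11434.
        client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

        resp = client.chat.completions.create(
            model="llama3.1",  # placeholder: any locally pulled model
            messages=[
                {"role": "system", "content": "Fix spelling and grammar. Keep the author's style."},
                {"role": "user", "content": "Their going to the store tomorow."},
            ],
        )
        print(resp.choices[0].message.content)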

    • dotancohen 2 days ago

      Yes, but the API on the other end is providing the core functionality. Simply swapping out one LLM for another - let alone one from a different company altogether - will completely change the effectiveness and usefulness of the application.

      • Tepix 2 days ago

        Well, as we see with AI applications like "Leo AI" and "Continue", a locally run LLM can be a fantastic replacement for proprietary offerings.

        • dartos 2 days ago

          FWIW I’ve found local models to be essentially useless for coding tasks.

          • Tepix 2 days ago

            Really? Maybe your models are too small?

            • spmurrayzzz a day ago

              The premier open-weight models don't even perform well on the public benchmarks compared to frontier models. And that's assuming at least some degree of benchmark contamination for the open-weight models.

              While I don't think they're completely useless (though it's close), calling them fantastic replacements feels like an egregious overstatement of their value.

              EDIT: Also wanted to note that I think this becomes as much an expectations-setting exercise as it is evaluation on raw programming performance. Some people are incredibly impressed by the ability to assist in building simple web apps, others not so much. Experience will vary across that continuum.

              • dartos a day ago

                Yeah, comparing DeepSeek Coder V2 Lite (the best coding model I can find that'll run on my 4090) to Claude Sonnet under aider…

                DeepSeek Lite was essentially useless: too slow, and the edits were too low quality.

                I’ve been programming for about 17 years, so the things I want aider to do are a little more specific than building simple web apps. Larger models are just better at it.

                I can run the full DeepSeek Coder model in some cloud and probably get very acceptable results, but then it's no longer local.

            • websap a day ago

              Woah woah! Those are fighting words. /s

      • dartos 2 days ago

        One would hope that, since the problem these models are trying to solve is language modeling, they would eventually converge around similar capabilities.

      • JCharante 2 days ago

        everyone stands on the shoulders of giants.

        • sham1 2 days ago

          Things standing on the shoulders of proprietary giants shouldn't claim to be free software/open source.

          • t-writescode 2 days ago

            Their interfacing software __is__ open source, and they're asking for your OpenAI API key to operate. I would expect/desire open-source code if I were to use that, so I could be sure my API key was only being used for my work - that it's only my work I'm paying for, and that the key hasn't been stolen in some way.

        • noduerme 2 days ago

          My older brother who got me into coding learned to code in Assembly. He doesn't really consider most of my work writing in high level languages to be "coding". So maybe there's something here. But if I had to get into the underlying structure, I could. I do wonder whether the same can be said for people who just kludge together a bunch of APIs that produce magical result sets.

          • dotancohen 2 days ago

              > But if I had to get into the underlying structure, I could.
            
            How do you propose to get into the underlying structure of the OpenAI API? Breach their network and steal their code and models? I don't understand what you're arguing.

            • latexr 2 days ago

              > How do you propose to get into the underlying structure of the OpenAI API?

              The fact that you can’t is the point of the comment. You could get into the underlying structure of other things, like the C interpreter of a scripting language.

              • robertlagrant 2 days ago

                But what about the microcode inside the CPU?

                • zja a day ago

                  That tends to not be open source, and people don’t claim that it is.

            • seadan83 2 days ago

              I think the argument is that stitching things together at a high level is not really coding. A bit of a no-true-Scotsman perspective. The example is that anything more abstract than assembly is not even true coding, let alone creating a wrapper layer around an LLM.

            • K0balt 2 days ago

              I think the relevant analogy here would be to run a local model. There are several tools that make it easy to run local models behind a local API. I run a 70b finetune with some tool use locally on our farm, and it is accessible to all users as a local OpenAI alternative. For most applications it is adequate, and data stays on the campus area network.

              • noduerme 20 hours ago

                A more accurate analogy would be, are you capable of finding and correcting errors in the model at the neural level if necessary? Do you have an accurate mental picture of how it performs its tasks, in a way that allows you to predictably control its output, if not actually modify it? If not, you're mostly smashing very expensive matchbox cars together, rather than doing anything resembling programming.

                • K0balt 2 hours ago

                  As an ancient embedded-systems programmer, I feel your frustration… but I think that it's misguided. LLMs are not "computers". They are a statistics-driven tool for navigating human-written (and graphical) culture.

                  It just so happens to be that a lot of useful stuff is in that box, and LLMs are handy at bringing it out in context. Getting them to “think” is tricky, and it’s best to remember that what you are really doing is trying to get them to talk as if they were thinking.

                  It sure as heck isn’t programming lol.

                  Also, it’s useful to keep in mind that “hallucinations “ are not malfunctions. If you were to change parameters to eliminate hallucinations, you would lose the majority of the unusual usefulness of the tool, its ability to synthesise and recombine ideas in statistically plausible (but otherwise random) ways. It’s almost like imagination. People imagine goofy shit all the time too.

                  At any rate, using agentic scripting you can get it to follow a kind of plan, and it can get pretty close to an actual “train of thought”facsimile for some kinds of tasks.

                  There are some really solid use cases, actually, but I'd say mostly they aren't the ones trying to get LLMs to replace higher-level tasks. They are actually really good at doing rote menial things. The best LLM apps are going to be the boring ones.

  • guappa 2 days ago

    This stuff is starting to enter Debian as well -_-'

  • zlwaterfield 2 days ago

    The plan is to add local LLM support, so the goal is fully OSS. Agree the initial wording could have been better.

slg 2 days ago

I have been using LanguageTool[1] for years as "an open source alternative to [old school] Grammarly". It doesn't do that fancy "make this text more professional" AI stuff like this or Grammarly can now do, but they offer a self-hosted version so you don't need to send everything you write to OpenAI. If all you want is a better spelling/grammar checker, I highly recommend it.

[1] - https://github.com/languagetool-org/languagetool

  • dspillett 2 days ago

    You can also run your own local instance for the in-browser checking, which is handy for me as I need to be careful about sending text off to another company in another country (due to both client security requirements and personal paranoia!).

    You don't get the AI-based extras like paraphrasing and the other bits listed as premium-only (https://languagetool.org/premium_new), but if you install the n-gram DB for your language (https://languagetool.org/download/ngram-data/) I found it at least as good as - and for some examples better than - Grammarly's free offering last time I did a comparison.
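
    If it helps, a minimal sketch of what a check against a self-hosted instance looks like (assuming the LanguageTool server is listening on localhost:8081 - adjust host/port to your own setup):

        import requests

        # Ask the local LanguageTool server to check a sentence.
        resp = requests.post(
            "http://localhost:8081/v2/check",
            data={"text": "This are a example sentence.", "language": "en-US"},
        )
        for match in resp.json()["matches"]:
            print(match["message"], "->", [r["value"] for r in match["replacements"]])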

  • weinzierl 2 days ago

    It's great. I had a subscription to Grammarly for a couple of years and used both tools in parallel, but found myself increasingly using LanguageTool. It is strictly better, I'd say even for English, but certainly if you need other languages or deal with multilingual documents. So I canceled Grammarly and haven't missed it since.

    You also can self-host and we do that at my workplace, because we deal with sensitive documents.

  • dewey 2 days ago

    Same, it integrates with all input fields too and has all the browser extensions you need. Non-GitHub landing page: https://languagetool.org

  • lou1306 2 days ago

    For VSCode users who want to try out LanguageTool, I cannot recommend the LTeX extension [1] highly enough. Setting up a self-hosted configuration is really easy and it integrates very neatly with the editor. It was originally built for LaTeX but also supports Markdown now.

    [1]: https://github.com/valentjn/vscode-ltex

  • isaacfrond 2 days ago

    And you can write your own custom rules. It's great: as a reward for spotting an error in your writing, you get to write a tiny little bit of code to spot it automatically next time. I've collected hundreds.

    • divan 2 days ago

      Is there a way to add and use niche custom terminology?

      • isaacfrond 2 days ago

        I've turned off the spell checker. Spell checking is done just fine in Word so I don't need it there.

      • herrherrmann 2 days ago

        You can add your own words to your account, if that’s what you mean!

  • shahzaibmushtaq 2 days ago

    How come I have never heard of LanguageTool before? Or maybe I have never looked beyond Grammarly. Thank you!

  • herrherrmann 2 days ago

    Absolutely plus one on this. LanguageTool is great and I'm also very happy on the free tier. With the app installed on macOS it also checks mail in the Apple Mail app, for example.

  • Semaphor 2 days ago

    This explains why I was confused by this. I moved to LT many, many years ago, and didn’t know about those new Grammarly features. So I really wasn’t clear how rewriting a specific text had anything to do with Grammarly.

  • ktosobcy 2 days ago

    This! And what's more - it doesn't funnel everything I type to OpenAI, so I'd say it's more FOSS than this extension…

    • dspillett 2 days ago

      And if you are in a regulatory environment (or elsewhere where data exfiltration paranoia is part of your daily work life), you can install your own instance of the service (sans premium features) and not send your text anywhere outside infrastructure you control.

zlwaterfield 2 days ago

After years with Grammarly, I wanted a simpler, cheaper way to improve my writing. So I built Scramble, a Chrome extension that uses an LLM for writing enhancements.

Key features:
- Uses your OpenAI API key (100% local)
- Pre-defined prompts for various improvements
- Highlight text and wait for suggestions
- Currently fixed to GPT-4-turbo
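
Under the hood it's essentially one chat-completion call per highlighted selection - roughly this, sketched in Python for illustration (the extension itself is JavaScript, and the function name here is made up):

    import os
    from openai import OpenAI

    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])  # the user-supplied key

    def improve(text: str, prompt: str) -> str:
        """Send one of the pre-defined prompts plus the highlighted text."""
        resp = client.chat.completions.create(
            model="gpt-4-turbo",
            messages=[
                {"role": "system", "content": prompt},
                {"role": "user", "content": text},
            ],
        )
        return resp.choices[0].message.content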

Future plans: add LLM provider/model choice, custom prompts, bug fixes, and improve default prompts.

It's probably buggy, but I'll keep improving it. Feedback welcome.

GitHub: https://github.com/zlwaterfield/scramble

  • lhousa 2 days ago

    Rookie question: the OpenAI API endpoint costs extra, right? Not something that comes with ChatGPT or ChatGPT Plus.

    • zlwaterfield 2 days ago

      Correct but I'm going to loom into a locally running LLM so it would be free.

      • Tepix 2 days ago

        Please do (assuming you mean "look"). When you add support for a custom API URL, please make sure it supports HTTP Basic authentication.

        That's super useful for people who run, say, ollama with an nginx reverse proxy in front of it (that adds authentication).
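
        For reference, that call would look roughly like this from the client side (a sketch - llm.example.com and the credentials are placeholders):

            import requests

            # ollama's OpenAI-compatible API sitting behind an nginx reverse proxy
            # that enforces HTTP Basic auth.
            resp = requests.post(
                "https://llm.example.com/v1/chat/completions",
                auth=("user", "password"),  # Basic auth is handled by the proxy
                json={
                    "model": "llama3.1",
                    "messages": [{"role": "user", "content": "Fix: Their going home."}],
                },
            )
            print(resp.json()["choices"][0]["message"]["content"])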

      • nickthegreek 2 days ago

        Please look into allowing it to connect to either an LM Studio endpoint or ollama.

    • Szpadel 2 days ago

      Yes, but gpt-4o-mini costs very little, so you'll probably spend well under $1/month.

      • miguelaeh 2 days ago

        I don't think the point here should be the cost, but the fact that you are sending everything you write to OpenAI to train their models on your information. The option of a local model allows you to preserve the privacy of what you write. I like that.

        • nickthegreek 2 days ago

          Openai does not train models on data that comes in from the API.

          https://openai.com/policies/business-terms/

          • punchmesan a day ago

            Assuming for the moment that they aren't saying that with their fingers crossed behind their back, that doesn't change the fact that they store the inputs they receive and swear they'll protect them (paraphrasing from the Content section of the above link). Even if it's not fed back into the LLM, the fact that they store the inputs anywhere for a period of time is a huge privacy risk -- after all, a breach is a matter of "when", not "if".

  • compootr 2 days ago

    how much does it cost in a normal day?

    • Tepix 2 days ago

      Don't think about money. Think about the cost in terms of forgone privacy.

      • compootr a day ago

        To protect your privacy from Grammarly, you fork over your data to OpenAI?

    • pkhamre 2 days ago

      What is a normal day?

      • compootr a day ago

        like what he's spending on average.

        Maybe sending some emails, writing or proofreading some docs -- what you'd do in a business day

      • exe34 2 days ago

        a day when nothing too unusual happens.

  • xdennis 2 days ago

    > Key features: - Uses your OpenAI API key (100% local)

    Sorry, but we have a fundamental disagreement on terms here. Sending requests to OpenAI is not 100% local.

    The OpenAI API is not free or open source. By your definition, if you used the Grammarly API for this extension it would be a 100% local, open source alternative to Grammarly too.

    • zlwaterfield 2 days ago

      Agree, I want to add a local LLM setup. The wording there isn't great.

  • kylebenzle 2 days ago

    Without marketing speak, can I ask why anyone would have a need for a service like Grammarly? I always thought it was odd trying to sell a subscription-based spell checker (AI is just a REALLY good spell checker).

    • gazereth 2 days ago

      Non-native speakers find it useful since it doesn't just fix spelling but also fixes correctness, directness, tone and tense. It gives you an indication of how your writing comes across, e.g. friendly, aggressive, assertive, polite.

      English can be a very nuanced language - easy to learn, difficult to master. Grammarly helps with that.

    • rlayton2 2 days ago

      I'm a big fan of Grammarly and have been using it, and paying for it, for years.

      The advantage is not spell checking. It is grammar and style improvements. It tells you things like "this language is informal", or "this is a better word for that".

    • mhuffman 2 days ago

      The "grammar" part, at least in a professional setting. You might be shocked at how many people will write an email pretty much like they would talk to friends at a club or send a text message (complete with emojis!) or just generally butcher professional correspondence.

      • dotancohen 2 days ago

        So it may be more attractive to employers to check their employees' output, rather than an individual checking his own?

        • oneeyedpigeon 2 days ago

          No, it's also useful to check your own writing. I've used it as both an editor and a writer.

    • socksy 2 days ago

      It is widely used in countries where the professional language is English, but the native language of the speakers is not.

      For example, most Slavic languages don't have the same definite/indefinite article system English does, which means that whilst someone could speak and write excellent English, the correct usage of "a" and "the" is a constant conscious struggle, where having a tool to check and correct your working is really useful. In Greek, word order is not so important. And so on.

      Spell check usually just doesn't cut it, and when it does (say, in Word), it usually isn't universally available.

      Personally, I have long wanted such a system for German, which I am not native in. Lucky for me DeepL launched a similar product with German support.

      A recent example for me was that I was universally using "bekommen" as a literal translation of "receive" in all sentences where I needed that word. Through DeepL I learned that the more appropriate word in a bunch of contexts is "erhalten", which is the sort of thing that I would never have got from a spell check.

      Grammarly is, notably, a Ukrainian-founded company.

    • pbhjpbhj 2 days ago

      Without marketing speak, can I ask why anyone would have a need for a service like Grammarly?

          ---
      
      Manual corrections here, but maybe they give a clue?

      • robertlagrant 2 days ago

        They aren't a native English speaker and would like a hand with phrasing.

  • _HMCB_ 2 days ago

    This is awesome. Can’t wait to install it and put it through its paces.

  • TheRealPomax 2 days ago

    Does it work in "not a browser", though? Because that's the last place I need this - I really want this in Typora, VS Code, etc. instead.

    • zlwaterfield 2 days ago

      Not right now. Looking into a Mac app. This was just a quick and dirty first go at it.

      • TheRealPomax a day ago

        Makes sense. I strongly hope it won't be a "Mac app" but a cross-platform application instead, though - there's nothing worse than having a great Mac app that you can't use 50% of the time because your work computer's a Mac and your personal computer's a Windows machine because you like playing games.

remoquete 2 days ago

In the same space, I recommend checking out the Vale linter. Fairly powerful and open source, too. And doesn't rely on a backend.

https://vale.sh

  • loughnane 2 days ago

    I love Vale. I've been using it for years. I branched rules from someone trying to emulate the Economist style guide and kept tweaking.

    I like this approach so much better than leaning on AI because it’s more my “voice”.

    https://github.com/loughnane/style

aDyslecticCrow 2 days ago

Grammarly is a lifesaver for my day-to-day writing. All it does is correct spelling and punctuation or give rephrase suggestions. But Grammarly does it so unreasonably well that nothing else compares.

Grammarly's core functionality is not even LLM-based; it's older than that. Recently, they've crammed in some LLM features that I don't care a snoot about compared to its core functionality.

This tool, like any other "Grammarly alternative," is just another GPT wrapper to rewrite my text in an overly verbose and soulless way. I was hoping for a halfway-decent spelling corrector.

  • funshed a day ago

    Absolutely! As someone who is dyslexic, I find Grammarly is much more than the AI tool that was recently added - which is great, too.

vunderba 2 days ago

Nice job—I'm always a fan of 'bring your own key' (BYOK) approaches. I think there's a lot of potential in using LLMs as virtual copy editors.

I do a fair amount of writing and have actually put together several custom GPTs, each with varying degrees of freedom to rewrite the text.

The first one acts strictly as a professional editor—it's allowed to fix spelling errors, grammatical issues, word repetition, etc., but it has to preserve the original writing style.

I do a lot of dictation while I walk my husky, so when I get back home, I can run whisper, convert the audio to text, and throw it at the GPT. It cleans it up, structures it into paragraphs, etc. Between whisper/GPT, it saves me hours of busy work.

The other one is allowed to restructure the text, fix continuity errors, replace words to ensure a more professional tone, and improve the overall flow. This one is reserved more for public communication such as business-related emails.
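
Roughly, the dictation-to-editor loop looks like this (a sketch - I actually use custom GPTs rather than raw API calls, so the model name and audio file below are placeholders):

    import os
    import whisper
    from openai import OpenAI

    # Transcribe the dictation locally with openai-whisper...
    model = whisper.load_model("base")  # pick a size that fits your hardware
    transcript = model.transcribe("walk.m4a")["text"]

    # ...then hand the raw transcript to a strict "editor" prompt.
    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "Fix spelling, grammar and punctuation, and break "
                                          "the text into paragraphs. Preserve my wording and style."},
            {"role": "user", "content": transcript},
        ],
    )
    print(resp.choices[0].message.content)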

  • edweis 2 days ago

    > I'm always a fan of 'bring your own key' (BYOK) approaches.

    "Bring your own key" has the same amount of syllables as "BYOK"

    • closetkantian 2 days ago

      If your point is that BYOK is a useless acronym since it has the same number* of syllables, I disagree. Acronyms aren't just for reducing syllable count; they also reduce visual clutter and are easier to read for people who scan text.

      • pixelpoet a day ago

        My brother from another mother, I thought I was the only one left who distinguishes much from many. (I wish I didn't know that it's technically an initialism not an acronym...)

        • closetkantian 17 hours ago

          Hahaha, this comment has me thinking about how I would pronounce it. Bee-yok? Bye-yolk?

  • copperx 2 days ago

    I do something similar. I have a custom Gemini Gem that critiques my writing and points out how I can better my paragraphs, but I do the bulk of the rewriting myself.

    I'm not a native speaker, and the nice thing about this approach is that I seem to be learning to write better instead of just delegating the task to the machine.

  • thankyoufriend 2 days ago

    Very cool! I'd be interested in reading more about your dictation-to-text process if you documented it somewhere, thanks.

    My partner and I were just talking about how useful that would be, especially driving in the car when all of the "we should..." thoughts come out of hiding. Capturing those action items more organically without destroying the flow of the conversation would be heavenly.

chilipepperhott 2 days ago

While Scramble doesn't seem to respect your privacy, a project I've been working on does.

Meet Harper: https://github.com/elijah-potter/harper

  • singhrac a day ago

    I think Harper is very cool, and you should sell it better. It's a local-only low latency & static (no Python) LanguageTool alternative. It doesn't use a large language model.

polemic 2 days ago

Seems a stretch to call it open source.

  • WA 2 days ago

    Seems a stretch to call it "more privacy-friendly" if it talks to OpenAI.

  • insane_dreamer 2 days ago

    Disagree. The fact that it can call another closed-source service doesn't mean that this tool itself is not open source.

  • senko 2 days ago

    The source seems to be at the linked repo, and the license is MIT. How’s that a stretch?

    • trog 2 days ago

      > The source seems to be at the linked repo, and the license is MIT. How’s that a stretch?

      Speaking for myself, I clicked on this thinking it might be open source in the sense of something I can run fully locally, like with a small grammar-only model.

      • n_plus_1_acc 2 days ago

        Check out LanguageTool, as mentioned in other comments. It is truly open source.

    • latexr 2 days ago

      Because it’s a wrapper on a closed-source system.

      Imagine writing a shell script that cuts and converts video by calling ffmpeg. Would you say it was "a video converter written in bash"? No - the important part would not be in bash; that's just the thin wrapper used to call the tool and could be in any language. Meaning it would be useless to anyone who e.g. worked on a constrained system where they are not allowed to install any binaries.

      Same thing here. If you only run open-source software for privacy reasons, sending all your program data to some closed server you don’t control doesn’t address your issue. There’s no meaningful difference between making an open-source plugin that calls an OpenAI API and one that calls a Grammarly API.

      • guappa 2 days ago

        I've seen posts like "JS interpreter written in 1 line" that were just a script calling node…

    • TheDong 2 days ago

      Code is only copyrightable if it has some element of creativity.

      This repo is really _only_ 7 sentences, like "Please correct spelling mistakes in the following text: " (these: https://github.com/zlwaterfield/scramble/blob/2c1d9ebbd6b935...)

      Everything else is uncreative, and possibly un-copyrightable, boilerplate to send those sentences to OpenAI.

      All of the creative software happens on OpenAI's servers using proprietary code.

      • too_damn_fast 2 days ago

        Why would you even say 'please' in a prompt?

        • t-writescode 2 days ago

          There is evidence that some LLMs sometimes give better responses when prompts are polite.

          And some people just try to be polite and it only costs a couple tokens.

          • chaosist a day ago

            I used to say please/thank you to GPT-4 in 2023 all the time, but it was because I was completely anthropomorphizing the model in various ways.

            I suspect it would be just as easy to write a paper showing that saying please has absolutely no effect on the output. I feel like GPT-4 is/was stochastically better on some days and at some hours than others. That might be wrong too, though. The idea that it is provable that "please" has a positive effect on the output is most likely ridiculous.

    • dotancohen 2 days ago

      The MIT licensed code is a wrapper for the OpenAI API. That OpenAI API provides the core functionality, and it is not open source.

    • xdennis 2 days ago

      The entire codebase is one call to `api.openai.com`.

      If I sold you an electrical generator, but the way it worked was by plugging it in, would it be fair to call it a generator?

nucleartux 2 days ago

I made the same thing, but it works without a ChatGPT key: https://github.com/nucleartux/ai-grammar/

  • creesch 2 days ago

    That looks pretty neat. How well does the Gemini Nano model work for this? Is it just picking up spelling errors, or also looking at things like punctuation?

    • nucleartux 2 days ago

      It actually works pretty well. It fixes all grammar mistakes and punctuation and changes words if they don’t fit. The only downside is that, because it’s a very small model, it sometimes produces completely nonsensical or incomplete responses. I haven’t figured out how to fix this yet.

      You can have a look at the screenshots in the repository or on the store page.

  • Tepix 2 days ago

    Nice. Can you please add support for contacting your own private OpenAI-compatible server (like ollama)?

rafram 2 days ago

Grammarly's grammar checking predates modern LLMs by many years, so I assume they're actually using some kind of rule-based engine internally.

  • tiew9Vii 2 days ago

    I was a big fan of Grammarly. As a dyslexic, I often write the wrong word, then ten minutes later when re-reading spot that I used the wrong word/spelling, etc.

    It worked extremely well - as you say, I think, by using basic rules engines.

    I’ve canceled my subscription recently as found it getting worse, not better, I suspect because they are now applying LLMs.

    The suggestions started to make less sense, and the problem with LLM suggestions is that all your writing takes on the tone of the LLM - you lose your personality/style in what you write.

    The basic rules approach worked much better for me.

  • TheRealPomax 2 days ago

    This pretends that LLMs aren't just "more machine learning", which they simply are.

conradklnspl 2 days ago

How does this compare to https://languagetool.org, which is also open source?

I'm not sure what kind of AI LanguageTool uses, but it works really well!

bartread 2 days ago

> It's designed to be a more customizable and privacy-respecting alternative to Grammarly.

Kind of a shame it says it’s specifically for Chrome then. Where’s the love for Firefox?

  • daef 2 days ago

    Upping this - I won't install Chrome :)

halJordan 2 days ago

Seems like it just has some prebaked prompts right now. FF's AI integration does this much already with custom prompts and custom providers. Please let me set my own base URL. So many tools already support the OpenAI API.

All of that to say, this is of course a great addition to the ecosystem.

ichik 2 days ago

For me a huge part of Grammarly's magic is that it's not just in the browser, but in any text input on desktop with their desktop app (with some exceptions). Having it in only one application just doesn't cut it, especially since it's not my browser of choice. Are there any plans regarding desktop integration? Linux is woefully underserved in this space, with all major offerings (Grammarly, LanguageTool) having only macOS/Windows versions.

  • bukacdan 2 days ago

    I have developed a system-wide writing assistant like you're describing. By design, there are no exceptions to where it works.

    Currently, it's only for Mac, but I'm working on an Electron version too (though it's quite challenging).

    Check out https://steerapp.ai/

    • ichik a day ago

      Is the Electron version supposed to be available on Linux? I see only mentions of Windows on the website.

kirso 8 hours ago

Is there something like that for VSCode?

grayxu 2 days ago

One strong point of Grammarly comes from its friendly display of diffs (which is somewhat similar to what Cursor does). This project simply uses some predefined prompts to generate text and then replaces it. There are countless plugins that can achieve this, such as the OpenAI translator.

If this tool really wants to compete with Grammarly, it needs that kind of diff-style UX.

miguelaeh 2 days ago

I am a Grammarly user and I just installed Scramble to try it out. However, it does not seem to work. When I click on any of the options, nothing happens. I use Ubuntu 22.04.

Also, to provide some feedback: it would be awesome to make it automatically appear on text areas and highlight errors like Grammarly does - it creates a much better UX.

  • zlwaterfield 2 days ago

    Agree - I want to improve the UX, this was just a quick attempt at it. Thanks for the feedback!

    • miguelaeh 2 days ago

      You're welcome! Let me know if you plan to integrate local models as mentioned in other comments - I am working on something to make it transparent.

raverbashing 2 days ago

> open-source Chrome extension

> It's designed to be a more customizable and privacy-respecting alternative to Grammarly.

> This extension requires an OpenAI API key to function

I disagree with this description of the service.

No, it's not an "open source alternative to Grammarly"; it's an OpenAI wrapper.

  • 8f2ab37a-ed6c 2 days ago

    Wonder if there's an option to somehow pipe the prompting to a local ollama instead.

    • raverbashing 2 days ago

      That would be an interesting possibility

  • zlwaterfield 2 days ago

    Agree, wording could be improved. I'm gonna add local LLM support.

shahzaibmushtaq 2 days ago

Grammarly was here before the AI boom, so Grammarly isn't just dependent on AI, but also heavily on HI.

gaiagraphia 2 days ago

>Important: This extension requires an OpenAI API key to function. You need to provide your own API key in the extension settings. Please visit OpenAI to obtain an API key.

Obviously not important enough to put in the title, or a submission statement here, though. Curious.

  • zlwaterfield 2 days ago

    Honestly just an oversight. I want to remove that dependency anyways with an open source model.

lvl155 2 days ago

I am building something similar to Grammarly as a personal project but quickly realized how hard it is to get data in 2024. Contemplating whether I should just resort to pirated data, which is just sad.

  • highcountess 2 days ago

    I’m just going to remind everyone that all these LLMs were also trained on not just pirated, but all out stolen data in organized and resourced assaults on proprietary information/data, not even to mention roughshod ignoring any and all licenses.

mobscenez 2 days ago

That's awesome. Grammarly is good, but not as good as large language models such as GPT-4. I have been waiting for a tool that incorporates LLMs into grammar checks for a long time, and here it comes! Hope it can integrate the Anthropic API in the near future.

isaacfrond 2 days ago

Nowadays I just load the whole thing into ChatGPT and it checks it better than I ever could. You've got to be clear about what you want done in the prompt: "Don't change my writing! Only correct errors."

037 2 days ago

An alternative from the developer of Coolify. It’s no longer for sale, but the page mentions he’ll open-source it:

https://safetyper.com/

ofou 2 days ago

Loved it. I'd love to use something like "right-click, fix grammar" under iOS—not just rewrite. I want to keep my own voice, just with minimal conformant grammar as a second-language speaker.

  • rafram 2 days ago

    AFAIK Apple Intelligence will include essentially that.

    • ofou 2 days ago

      let's hope the rumors are real

ziddoap 2 days ago

Privacy.md needs to be updated.

>If you have any questions about this privacy policy, please contact us at [your contact information].

HL33tibCe7 2 days ago

This is exactly as open source as a Chrome extension wrapping Grammarly’s API would be, i.e. not at all.

janandonly 2 days ago

I am currently paying for LanguageTool but I will definitely give this open source software a try!

the_arun 2 days ago

Do we need OpenAI for this? Can't we have an LLM sitting locally do the work?

nik736 2 days ago

How is it more privacy respecting when it's sending stuff to OpenAI servers?

reynaldi 2 days ago

Awesome, I was just about to look for something like this and it showed up on HN!

reify 2 days ago

I also use LanguageTool

easy to install in LibreOffice

Festro 2 days ago

So it doesn't provide realtime feedback on your writing within a dialog box like Grammarly does? It's just a set of pre-written prompts sent to (non-open-source) OpenAI?

Come on.

Pitch this honestly. It'll save me clicks if I'm already using an LLM to check grammar, but if I use Grammarly, it's not an alternative at all. Not by a long way.

lccerina 2 days ago

It uses OpenAI, so it's not open source. Keep this shit away from me.

mproud a day ago

F*ck Grammarly.