Technetium a day ago | next |

They proclaim "privacy-respecting" but all your keystrokes go to OpenAI. Horrific and genuinely upsetting.

Edit: The author replied to another comment that there is an intent to add local AI. If that is the plan, then fix the wording until it can actually be considered privacy-respecting: https://news.ycombinator.com/item?id=41579144

charlie0 21 hours ago | root | parent | next |

Lol, this was my second thought, immediately after my first, which was excitement. Hope the author does add an option for a local model. Wonder how that would work as a Chrome extension. Doesn't seem like a good idea for extensions to be accessing local resources though.

mdaniel 19 hours ago | root | parent | next |

> Doesn't seem like a good idea for extensions to be accessing local resources though.

To the best of my knowledge, all localhost connections are exempt from CORS, and that's in fact how the 1Password extension communicates with the desktop app. I'd bet Bitwarden and KeePassXC behave similarly.

fph 13 hours ago | root | parent | prev |

You can self-host LanguageTool and use it via the Chrome/Firefox extension. The extension talks to a LanguageTool server over HTTP and takes the server's address as a configurable option. So you just run the local server and pass localhost:8080 as the server address; a sketch of the underlying call is below.
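
For reference, the check itself is just an HTTP POST to the server's /v2/check endpoint. This is a minimal sketch, assuming the standard LanguageTool HTTP API and whatever port you configured; the function name is illustrative, not taken from the extension:

    // Minimal sketch: checking text against a self-hosted LanguageTool server.
    // Port 8080 is the address configured in the extension settings above.
    async function checkText(text) {
      const params = new URLSearchParams({ text, language: "en-US" });
      const response = await fetch("http://localhost:8080/v2/check", {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body: params,
      });
      const result = await response.json();
      // Each match describes one issue: message, offset, length, replacements.
      return result.matches;
    }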

Eisenstein 18 hours ago | root | parent | prev |

Download koboldcpp and Llama 3.1 GGUF weights, and use it with the Llama 3 completions adapter.

Edit the 'background.js' file in the extension and replace the OpenAI endpoint with

'http://your.local.ip.addr:5001/v1/chat/completions'

Set anything you want as an API key. Now you have a truly local version. (A sketch of what the edited call looks like is below the links.)

* https://github.com/LostRuins/koboldcpp/releases

* https://huggingface.co/bartowski/Meta-Llama-3.1-8B-Instruct-...

* https://github.com/LostRuins/koboldcpp/blob/concedo/kcpp_ada...
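
For illustration only, the swapped-out request would look roughly like this. This is a hedged sketch, not the extension's actual code: the function and variable names are hypothetical, and the model field is whatever your local server expects (koboldcpp serves whichever model is loaded):

    // Illustrative: an OpenAI-compatible chat completions request pointed at a
    // local koboldcpp server instead of api.openai.com.
    const API_URL = "http://your.local.ip.addr:5001/v1/chat/completions";
    const API_KEY = "anything"; // local servers typically ignore the key

    async function improveText(instruction, text) {
      const response = await fetch(API_URL, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          "Authorization": `Bearer ${API_KEY}`,
        },
        body: JSON.stringify({
          model: "local-model", // placeholder; koboldcpp uses the loaded model
          messages: [
            { role: "system", content: instruction },
            { role: "user", content: text },
          ],
        }),
      });
      const data = await response.json();
      return data.choices[0].message.content;
    }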

Alex4386 a day ago | prev | next |

People really should stop calling a glorified OpenAI API wrapper open-source software.

jillesvangurp a day ago | root | parent | next |

There are several free alternatives to OpenAI that use the same API, which would make it possible to swap OpenAI out for one of those models in this extension, at least on paper. There is an open issue on the GitHub repository requesting something like that.

So it's not as clear-cut. The general approach of using LLMs for this is not a bad one; LLMs are pretty good at this stuff.

dotancohen a day ago | root | parent |

Yes, but the API at the end is providing the core functionality. Simply swapping out one LLM for another - let alone one from a different company altogether - will completely change the effectiveness and usefulness of the application.

Tepix a day ago | root | parent | next |

Well, as we see with AI applications like "Leo AI" and "Continue", a locally run LLM can be a fantastic replacement for proprietary offerings.

dartos a day ago | root | parent |

FWIW I’ve found local models to be essentially useless for coding tasks.

Tepix 20 hours ago | root | parent |

Really? Maybe your models are too small?

spmurrayzzz 19 hours ago | root | parent | next |

The premier open-weight models don't perform well on public benchmarks compared to frontier models, and that's assuming at least some degree of benchmark contamination inflating the open-weight scores.

While I don't think they're completely useless (though it's close), calling them fantastic replacements feels like an egregious overstatement of their value.

EDIT: Also wanted to note that I think this becomes as much an expectations-setting exercise as an evaluation of raw programming performance. Some people are incredibly impressed by the ability to assist in building simple web apps; others, not so much. Experience will vary across that continuum.

dartos 7 hours ago | root | parent |

Yeah, in comparing DeepSeek Coder V2 Lite (the best coding model I can find that'll run on my 4090) to Claude Sonnet under aider…

DeepSeek Lite was essentially useless: too slow, and the edits were too low quality.

I’ve been programming for about 17 years, so the things I want aider to do are a little more specific than building simple web apps. Larger models are just better at it.

I can run the full DeepSeek Coder model on some cloud and probably get very acceptable results, but then it's no longer local.

dartos a day ago | root | parent | prev | next |

One would hope that, since the problem these models are trying to solve is language modeling, they would eventually converge on similar capabilities.

JCharante a day ago | root | parent | prev |

Everyone stands on the shoulders of giants.

sham1 a day ago | root | parent | next |

Things standing on the shoulders of proprietary giants shouldn't claim to be free software/open source.

t-writescode a day ago | root | parent |

Their interfacing software __is__ open source, and they're asking for your OpenAI API key to operate. I would expect and want open-source code if I were to use that, so I could be sure my API key was only being used for my work: that it's only my work I'm paying for, and that the key hasn't been stolen in some way.

noduerme a day ago | root | parent | prev |

My older brother who got me into coding learned to code in Assembly. He doesn't really consider most of my work writing in high level languages to be "coding". So maybe there's something here. But if I had to get into the underlying structure, I could. I do wonder whether the same can be said for people who just kludge together a bunch of APIs that produce magical result sets.

dotancohen a day ago | root | parent |

  > But if I had to get into the underlying structure, I could.

How do you propose to get into the underlying structure of the OpenAI API? Breach their network and steal their code and models? I don't understand what you're arguing.

latexr a day ago | root | parent | next |

> How do you propose to get into the underlying structure of the OpenAI API?

The fact that you can’t is the point of the comment. You could get into the underlying structure of other things, like a scripting language's interpreter written in C.

seadan83 a day ago | root | parent | prev | next |

I think the argument is that stitching things together at a high level is not really coding. A bit of a no-true-Scotsman perspective. The example is that anything more abstract than assembly is not even true coding, let alone creating a wrapper layer around an LLM.

K0balt a day ago | root | parent | prev |

I think the relevant analogy here would be to run a local model. There are several tools that make it easy to run local models behind a local API. I run a 70B finetune with some tool use locally on our farm, and it is accessible to all users as a local OpenAI alternative. For most applications it is adequate, and the data stays on the campus area network.

noduerme 3 hours ago | root | parent |

A more accurate analogy would be, are you capable of finding and correcting errors in the model at the neural level if necessary? Do you have an accurate mental picture of how it performs its tasks, in a way that allows you to predictably control its output, if not actually modify it? If not, you're mostly smashing very expensive matchbox cars together, rather than doing anything resembling programming.

slg a day ago | prev | next |

I have been using LanguageTool[1] for years as "an open source alternative to [old school] Grammarly". It doesn't do the fancy "make this text more professional" AI stuff that this extension or Grammarly can now do, but they offer a self-hosted version so you don't need to send everything you write to OpenAI. If all you want is a better spelling/grammar checker, I highly recommend it.

[1] - https://github.com/languagetool-org/languagetool

dspillett a day ago | root | parent | next |

You can also run your own local instance for the in-browser checking, which is handy for me as I need to be careful about sending text off to another company in another country (due to both client security requirements and personal paranoia!).

You don't get the AI-based extras like paraphrasing, and the other bits listed as premium-only (https://languagetool.org/premium_new), but if you install the n-gram DB for your language (https://languagetool.org/download/ngram-data/) I found it at least as good as, and for some examples better than, Grammarly's free offering the last time I did a comparison.

weinzierl a day ago | root | parent | prev | next |

It's great. I had a Grammarly subscription for a couple of years and used both tools in parallel, but increasingly found myself mostly using LanguageTool. I'd say it's strictly better, even for English but certainly if you need other languages or deal with multilingual documents. So I canceled Grammarly and haven't missed it since.

You can also self-host, and we do that at my workplace because we deal with sensitive documents.

lou1306 a day ago | root | parent | prev | next |

For VSCode users who want to try out LanguageTool, I cannot recommend the LTeX extension [1] highly enough. Setting up a self-hosted configuration is really easy and it integrates very neatly with the editor. It was originally built for LaTeX but also supports Markdown now.

[1]: https://github.com/valentjn/vscode-ltex

isaacfrond a day ago | root | parent | prev | next |

And you can write your own custom rules. It's great: as a reward for spotting an error in your writing, you get to write a tiny little bit of code to spot it automatically next time. I've collected hundreds.

herrherrmann a day ago | root | parent | prev | next |

Absolutely plus one on this. LanguageTool is great and I’m also very happy on the free tier. With the app installed on macOS it also checks mails in the Apple Mail app, for example.

Semaphor a day ago | root | parent | prev | next |

This explains why I was confused by this. I moved to LT many, many years ago, and didn’t know about those new Grammarly features. So I really wasn’t clear how rewriting a specific text had anything to do with Grammarly.

ktosobcy a day ago | root | parent | prev | next |

This! And what's more, it doesn't funnel everything I type to OpenAI, so I'd say it's more FOSS than this extension…

dspillett a day ago | root | parent |

And if you are in a regulatory environment (or elsewhere where data exfiltration paranoia is part of your daily work life), you can install your own instance of the service (sans premium features) and not send your text anywhere outside infrastructure you control.

zlwaterfield a day ago | prev | next |

After years with Grammarly, I wanted a simpler, cheaper way to improve my writing. So I built Scramble, a Chrome extension that uses an LLM for writing enhancements.

Key features:

- Uses your OpenAI API key (100% local)

- Pre-defined prompts for various improvements

- Highlight text and wait for suggestions

- Currently fixed to GPT-4-turbo

Future plans: add LLM provider/model choice, custom prompts, bug fixes, and improve default prompts.

It's probably buggy, but I'll keep improving it. Feedback welcome.

GitHub: https://github.com/zlwaterfield/scramble

lhousa a day ago | root | parent | next |

Rookie question: the OpenAI API costs extra, right? It's not something that comes with ChatGPT or ChatGPT Plus.

zlwaterfield a day ago | root | parent | next |

Correct, but I'm going to look into a locally running LLM so it would be free.

Tepix a day ago | root | parent | next |

Please do. When you add support for a custom API URL, please make sure it supports HTTP Basic authentication.

That's super useful for people who run, say, ollama with an nginx reverse proxy in front of it (which adds authentication).

Szpadel a day ago | root | parent | prev |

Yes, but GPT-4o mini costs very little, so you'll probably spend well under $1/month.

miguelaeh a day ago | root | parent |

I don't think the point here should be the cost, but the fact that you are sending everything you write to OpenAI to train their models on your information. The option of a local model allows you to preserve the privacy of what you write. I like that.

nickthegreek 21 hours ago | root | parent |

OpenAI does not train models on data that comes in through the API.

https://openai.com/policies/business-terms/

punchmesan 19 hours ago | root | parent |

Assuming for the moment that they aren't saying that with their fingers crossed behind their back, it doesn't change the fact that they store the inputs they receive and swear they'll protect them (paraphrasing the Content section of the above link). Even if it's not fed back into the LLM, the fact that they store the inputs anywhere for any period of time is a huge privacy risk -- after all, a breach is a matter of "when", not "if".

compootr a day ago | root | parent | prev | next |

How much does it cost on a normal day?

kylebenzle a day ago | root | parent | prev | next |

Without the marketing speak, can I ask why anyone would need a service like Grammarly? I always thought it was odd to try to sell a subscription-based spell checker (AI is just a REALLY good spell checker).

gazereth a day ago | root | parent | next |

Non-native speakers find it useful since it doesn't just fix spelling but also fixes correctness, directness, tone and tense. It gives you an indication of how your writing comes across, e.g. friendly, aggressive, assertive, polite.

English can be a very nuanced language - easy to learn, difficult to master. Grammarly helps with that.

rlayton2 a day ago | root | parent | prev | next |

I'm a big fan of Grammarly and have been using it, and paying for it, for years.

The advantage is not spell checking. It is grammar and style improvements. It tells you things like "this language is informal", or "this is a better word for that".

mhuffman a day ago | root | parent | prev | next |

The "grammar" part, at least in a professional setting. You might be shocked at how many people will write an email pretty much like they would talk to friends at a club or send a text message (complete with emojis!) or just generally butcher professional correspondence.

socksy a day ago | root | parent | prev | next |

It is widely used in countries where the professional language is English, but the native language of the speakers is not.

For example, most Slavic languages don't have the same definite/indefinite article system English does, which means that whilst someone could speak and write excellent English, the correct usage of "a" and "the" is a constant conscious struggle, where having a tool to check and correct your working is really useful. In Greek, word order is not so important. And so on.

Spell check usually just doesn't cut it, and when it does (say, in Word), it usually isn't universally available.

Personally, I have long wanted such a system for German, which I am not native in. Lucky for me DeepL launched a similar product with German support.

A recent example for me was that I was universally using "bekommen" as a literal translation of "receive" in all sentences where I needed that word. Through DeepL I learned that the more appropriate word in a bunch of contexts is "erhalten", which is the sort of thing that I would never have got from a spell check.

Grammarly is, notably, a Ukrainian-founded company.

xdennis a day ago | root | parent | prev | next |

> Key features: - Uses your OpenAI API key (100% local)

Sorry, but we have a fundamental disagreement on terms here. Sending requests to OpenAI is not 100% local.

The OpenAI API is not free or open source. By your definition, if you used the Grammarly API for this extension it would be a 100% local, open source alternative to Grammarly too.

TheRealPomax a day ago | root | parent | prev |

Does it work in "not a browser" though? Because that's the last place I need this; I really want this in Typora, VS Code, etc. instead.

zlwaterfield a day ago | root | parent |

Not right now. Looking into a mac app. This was just a quick and dirty first go at it.

TheRealPomax 19 hours ago | root | parent |

Makes sense. I strongly hope it won't be a "Mac app" but a cross-platform application instead, though; nothing is worse than having a great Mac app that you can't use 50% of the time because your work computer's a Mac and your personal computer's a Windows machine because you like playing games.

aDyslecticCrow 21 hours ago | prev | next |

Grammarly is a lifesaver for my day-to-day writing. All it does is correct spelling and punctuation or give rephrase suggestions. But Grammarly does it so unreasonably well that nothing else compares.

Grammarly's core functionality is not even LLM-based; it's older than that. Recently, they've crammed in some LLM features that I don't care a snoot about compared to its core functionality.

This tool, like any other "Grammarly alternative," is just another GPT wrapper to rewrite my text in an overly verbose and soulless way. I was hoping for a halfway-decent spelling corrector.

funshed 19 hours ago | root | parent |

Absolutely! Being dyslexic, I find Grammarly is much more than the AI tool that was recently added (which is great, too).

vunderba a day ago | prev | next |

Nice job—I'm always a fan of 'bring your own key' (BYOK) approaches. I think there's a lot of potential in using LLMs as virtual copy editors.

I do a fair amount of writing and have actually put together several custom GPTs, each with varying degrees of freedom to rewrite the text.

The first one acts strictly as a professional editor—it's allowed to fix spelling errors, grammatical issues, word repetition, etc., but it has to preserve the original writing style.

I do a lot of dictation while I walk my husky, so when I get back home, I can run Whisper, convert the audio to text, and throw it at the GPT. It cleans it up, structures it into paragraphs, etc. Between Whisper and GPT, it saves me hours of busywork.
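
Not the author's exact setup, but a rough sketch of that kind of pipeline using the OpenAI Node SDK; the model choice, prompt wording, and function name here are assumptions:

    // Illustrative dictation-cleanup pipeline. Requires the official "openai"
    // npm package and OPENAI_API_KEY in the environment.
    import fs from "node:fs";
    import OpenAI from "openai";

    const openai = new OpenAI();

    async function cleanUpDictation(audioPath) {
      // Transcribe the recorded dictation with Whisper.
      const transcription = await openai.audio.transcriptions.create({
        file: fs.createReadStream(audioPath),
        model: "whisper-1",
      });

      // Have the model tidy up the transcript without changing the style.
      const completion = await openai.chat.completions.create({
        model: "gpt-4o",
        messages: [
          {
            role: "system",
            content:
              "You are a copy editor. Fix spelling, grammar, and punctuation, " +
              "and split the text into paragraphs, but preserve the author's wording and style.",
          },
          { role: "user", content: transcription.text },
        ],
      });

      return completion.choices[0].message.content;
    }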

The other one is allowed to restructure the text, fix continuity errors, replace words to ensure a more professional tone, and improve the overall flow. This one is reserved more for public communication, such as business-related emails.

edweis a day ago | root | parent | next |

> I'm always a fan of 'bring your own key' (BYOK) approaches.

"Bring your own key" has the same amount of syllables as "BYOK"

closetkantian a day ago | root | parent |

If your point is that BYOK is a useless acronym since it has the same number* of syllables, I disagree. Acronyms aren't just for reducing syllable count; they also reduce visual clutter and are easier to read for people who scan text.

pixelpoet 14 hours ago | root | parent |

My brother from another mother, I thought I was the only one left who distinguishes much from many. (I wish I didn't know that it's technically an initialism not an acronym...)

copperx a day ago | root | parent | prev | next |

I do something similar. I have a custom Gemini Gem that critiques my writing and points out how I can improve my paragraphs, but I do the bulk of the rewriting myself.

I'm not a native speaker, and the nice thing about this approach is that I seem to be learning to write better instead of just delegating the task to the machine.

thankyoufriend a day ago | root | parent | prev |

Very cool! I'd be interested in reading more about your dictation-to-text process if you documented it somewhere, thanks.

My partner and I were just talking about how useful that would be, especially driving in the car when all of the "we should..." thoughts come out of hiding. Capturing those action items more organically without destroying the flow of the conversation would be heavenly.

polemic a day ago | prev | next |

Seems a stretch to call it open source.

senko a day ago | root | parent | prev |

The source seems to be at the linked repo, and the license is MIT. How’s that a stretch?

trog a day ago | root | parent | next |

> The source seems to be at the linked repo, and the license is MIT. How’s that a stretch?

Speaking for myself, I clicked on this thinking it might be open source in the sense of something I can run fully locally, like with a small grammar-only model.

latexr a day ago | root | parent | prev | next |

Because it’s a wrapper on a closed-source system.

Imagine writing a shell script that cuts and converts video by calling ffmpeg. Would you say it was “a video converter written in bash”? No, the important part would not be in bash; that's just the thin wrapper used to call the tool, and it could be in any language. Meaning it would be useless to anyone who, e.g., worked on a constrained system where they are not allowed to install any binaries.

Same thing here. If you only run open-source software for privacy reasons, sending all your program data to some closed server you don’t control doesn’t address your issue. There’s no meaningful difference between making an open-source plugin that calls an OpenAI API and one that calls a Grammarly API.

guappa a day ago | root | parent |

I've seen posts of "js interpreter written in 1 line" that was just a script calling node…

TheDong a day ago | root | parent | prev | next |

Code is only copyrightable if it has some element of creativity.

This repo is really _only_ 7 sentences, like "Please correct spelling mistakes in the following text: " (these: https://github.com/zlwaterfield/scramble/blob/2c1d9ebbd6b935...)

Everything else is uncreative, and possibly un-copyrightable, boilerplate to send those sentences to OpenAI.

All of the creative part of the software runs on OpenAI's servers, using proprietary code.

too_damn_fast a day ago | root | parent |

Why would you even say 'please' in a prompt?

t-writescode a day ago | root | parent |

There is some evidence that, for some LLMs, politeness sometimes yields better responses.

And some people just try to be polite and it only costs a couple tokens.

chaosist 16 hours ago | root | parent |

I used to say please/thank you to GPT-4 in 2023 all the time, but it was because I was completely anthropomorphizing the model in various ways.

I suspect it would be just as easy to write a paper showing that saying please has absolutely no effect on the output. I feel like GPT-4 is/was stochastically better on some days and at some hours than others. That might be wrong too, though. The idea that it is provable that "please" has a positive effect on the output is most likely ridiculous.

dotancohen a day ago | root | parent | prev | next |

The MIT-licensed code is a wrapper around the OpenAI API. That API provides the core functionality, and it is not open source.

xdennis a day ago | root | parent | prev |

The entire codebase is one call to `api.openai.com`.

If I sold you an electrical generator, but the way it worked was by plugging it in, would it be fair to call it a generator?

nucleartux a day ago | prev | next |

I made the same thing, but it works without an OpenAI key: https://github.com/nucleartux/ai-grammar/

creesch a day ago | root | parent | next |

That looks pretty neat. How well does the Gemini Nano model work for this? Is it just picking up spelling errors, or is it also looking at things like punctuation?

nucleartux a day ago | root | parent |

It actually works pretty well. It fixes all grammar mistakes and punctuation and changes words if they don’t fit. The only downside is that, because it’s a very small model, it sometimes produces completely nonsensical or incomplete responses. I haven’t figured out how to fix this yet.

You can have a look at the screenshots in the repository or on the store page.

rafram a day ago | prev | next |

Grammarly's grammar checking predates modern LLMs by many years, so I assume they're actually using some kind of rule-based engine internally.

tiew9Vii a day ago | root | parent | prev | next |

I was a big fan of Grammarly. As a dyslexic, I often write the wrong word and then, ten minutes later when re-reading, spot that I used the wrong word/spelling, etc.

It worked extremely well, and as you say, I think it did so by using basic rules engines.

I've canceled my subscription recently as I found it getting worse, not better, I suspect because they are now applying LLMs.

The suggestions started to make less sense, and the problem with LLM suggestions is that all your writing takes on the tone of the LLM; you lose your personality/style in what you write.

The basic rules approach worked much better for me.

conradklnspl a day ago | prev | next |

How does this compare to https://languagetool.org, which is also open source?

I'm not sure what kind of AI Languagetool uses but it works really well!

patrakov a day ago | root | parent |

LanguageTool is not open source; it is open core. There are proprietary "premium rules," and you won't get them in a self-hosted version.

dns_snek a day ago | root | parent | prev |

Self-hosted and open core seems distinctly better than an open wrapper around a black-box core hosted by a third party.

bartread a day ago | prev | next |

> It's designed to be a more customizable and privacy-respecting alternative to Grammarly.

Kind of a shame it says it’s specifically for Chrome then. Where’s the love for Firefox?

halJordan a day ago | prev | next |

Seems like it just has some prebaked prompts right now. FF's AI integration does this much already, with custom prompts and custom providers. Please let me set my own base URL; so many tools already support the OpenAI API.

All of that to say, this is of course a great addition to the ecosystem.

ichik a day ago | prev | next |

For me, a huge part of Grammarly's magic is that it's not just in the browser but in any text input on the desktop with their desktop app (with some exceptions). Having it in only one application just doesn't cut it, especially since it's not my browser of choice. Are there any plans regarding desktop integration? Linux is woefully underserved in this space, with all major offerings (Grammarly, LanguageTool) having only macOS/Windows versions.

bukacdan a day ago | root | parent |

I have developed a system-wide writing assistant like you're describing. By design, it has no exceptions to where it works.

Currently, it's only for Mac, but I'm working on an Electron version too (though it's quite challenging).

Check out https://steerapp.ai/

ichik 19 hours ago | root | parent |

Is the Electron version supposed to be available on Linux? I see only mentions of Windows on the website.

grayxu 20 hours ago | prev | next |

One strong point of Grammarly comes from its friendly display of diffs (which is somewhat similar to what Cursor does). This project simply uses some predefined prompts to generate text and then replaces it. There are countless plugins that can achieve this, such as the OpenAI translator.

If this tool really wants to compete with Grammarly, it needs that kind of diff display.

miguelaeh a day ago | prev | next |

I am a Grammarly user and I just installed Scramble to try it out. However, it does not seem to work. When I click on any of the options, nothing happens. I use Ubuntu 22.04.

Also, to provide some feedback: it would be awesome to make it automatically appear on text areas and highlight errors like Grammarly does; that creates a much better UX.

zlwaterfield a day ago | root | parent |

Agree - I want to improve the UX, this was just a quick attempt at it. Thanks for the feedback!

miguelaeh a day ago | root | parent |

You're welcome! Let me know if you plan to integrate local models as mentioned in other comments; I am working on something to make that transparent.

raverbashing a day ago | prev | next |

> open-source Chrome extension

> It's designed to be a more customizable and privacy-respecting alternative to Grammarly.

> This extension requires an OpenAI API key to function

I disagree with this description of the service

No, it's not an "open source alternative to Grammarly"; it's an OpenAI wrapper.

gaiagraphia a day ago | prev | next |

>Important: This extension requires an OpenAI API key to function. You need to provide your own API key in the extension settings. Please visit OpenAI to obtain an API key.

Obviously not important enough to put in the title, or a submission statement here, though. Curious.

lvl155 a day ago | prev | next |

I am building something similar to Grammarly as a personal project, but I quickly realized how hard it is to get data in 2024. I'm contemplating whether I should just resort to pirated data, which is just sad.

highcountess a day ago | root | parent | prev |

I'm just going to remind everyone that all these LLMs were also trained on not just pirated but outright stolen data, in organized and well-resourced assaults on proprietary information/data, not to mention riding roughshod over any and all licenses.

mobscenez a day ago | prev | next |

That's awesome. Grammarly is good, but not as good as large language models such as GPT-4. I have been waiting for a tool that incorporates LLMs into grammar checks for a long time, and here it comes! Hope it can integrate the Anthropic API in the near future.

isaacfrond a day ago | prev | next |

Nowadays I just load the whole thing into ChatGPT and it checks it better than I ever could. You've got to be clear in the prompt about what you want done: don't change my writing, only correct errors.

ziddoap 21 hours ago | prev | next |

Privacy.md needs to be updated.

>If you have any questions about this privacy policy, please contact us at [your contact information].

ofou a day ago | prev | next |

Loved it. I'd love to use something like "right-click, fix grammar" under iOS—not just rewrite. I want to keep my own voice, just with minimal conformant grammar as a second-language speaker.

janandonly a day ago | prev | next |

I am currently paying for LanguageTool, but I will definitely give this open source software a try!

HL33tibCe7 a day ago | prev | next |

This is exactly as open source as a Chrome extension wrapping Grammarly’s API would be, i.e. not at all.

Festro a day ago | prev | next |

So it doesn't provide real-time feedback on your writing within a dialog box like Grammarly does? It's just a (non-open-source) set of pre-written OpenAI prompts?

Come on.

Pitch this honestly. It'll save me clicks if I'm already using an LLM to check grammar, but if I use Grammarly, it's not an alternative at all. Not by a long way.