Show HN: Machato — A native macOS client for ChatGPT
untimelyunicorn.gumroad.com

Machato is a tiny ChatGPT client that supports all basic features, and more!
I was frustrated with other native implementations that focused on quick access but didn't give easy access to a history of what has been said. Since ChatGPT is a learning tool for me, I'm always riffling through past conversations. The client supports markdown rendering as well as LaTeX. Feel free to try it out!
For those who see this thread early, you can use the promo code EARLY_BIRD to get a free lifetime license.
Let me know if some aspects can be improved or if there are features you'd like to see implemented in native clients.
EDIT: Since the early bird discount expired before this post could reach HN, I'll leave you with 50% off with the code MACHATO50!

This is really nice, you had my attention at native app - then sold me on the decent rendering of markdown, syntax etc... I love that it's a 100% native application (no Electron/Chrome garbage etc...), so refreshing to see an unbloated app these days. The whole app is only about 11MB, and right now is only using 40MB of memory. I just bought it.

Very nice, does what you would expect. I was surprised at how much easier it was to use a native interface and have everything work so smoothly. I like the GUI touches (icons for new conversations, etc.). Very mac-like.

Out of curiosity, what does a native app offer in this context that using ChatGPT in a browser tab doesn't? I understand you can add shortcuts such as opening it and starting a new chat, but outside of that?

Multiple factors made me implement this:

- ChatGPT Plus is crazy expensive, and the free tier is often down. I found it especially frustrating to not get access to my chat history when ChatGPT's load was high. Since alternative clients go through the API, they are exempt from load balancing.

- A native client is lightweight and doesn't rely on web technologies. This is a matter of personal taste, but I like the look and feel of native apps better than web pages.

- Current native clients are both very new and ill-fitted to my needs (I really like the LaTeX rendering feature and I want to be able to browse my chat history).

> ChatGPT Plus is crazy expensive

$20 per month is not crazy expensive relative to the value it provides.

$20 USD per month ($30 AUD), which is pretty damn expensive when you don't live in the USA. The USD has been inflated in value for so long now that we have to be careful what we buy in that currency.

$30 AUD is not super expensive in Australia either; a pub meal is about that these days.
That's a lot of money for some people, especially when everything is a subscription these days.

Depends on your usage, I guess. I personally couldn't justify it. In the last few weeks, I've used the API extensively and the total cost comes to a few dollars max. Admittedly, the API doesn't provide nearly as many features as ChatGPT Plus does. To each their own!

> Admittedly, the API doesn't provide nearly as many features as ChatGPT Plus does.

A lot of the ChatGPT Plus features are features of the wrapper rather than the model, sure. Then again, that's what things like Langchain are for, right?

Not even kidding, I would pay $1000 USD per month for GPT4 access.

You know you can use phind.com with GPT4 access in both expert and creative mode? It's fine-tuned for writing code, but it's good for other stuff, too. Or do you mean API access? There is a form to fill out and they seem to be pretty generous with handing out access: https://openai.com/waitlist/gpt-4-api (edit: They didn't email me, I checked one day and I was able to use model="gpt-4", but not the 32K model.)

Wow, is phind completely free? How do you know that they're using GPT under the hood for their answers?

edit: found their Show HN: https://news.ycombinator.com/item?id=34884338
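The "few dollars max" figure quoted above is easy to sanity-check from the per-token prices. Here is a rough sketch of the arithmetic, using the early-2023 list prices per 1K tokens (these rates are assumptions from memory of that period and change over time — check OpenAI's pricing page before relying on them):

```python
# Rough per-request cost estimate for the OpenAI chat API.
# Prices are USD per 1K tokens, early-2023 list prices (assumption).
PRICE_PER_1K = {
    "gpt-3.5-turbo": 0.002,      # single rate for prompt + completion
    "gpt-4": (0.03, 0.06),       # (prompt rate, completion rate)
    "gpt-4-32k": (0.06, 0.12),   # (prompt rate, completion rate)
}

def estimate_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Return the estimated USD cost of a single request."""
    price = PRICE_PER_1K[model]
    if isinstance(price, tuple):
        in_rate, out_rate = price
    else:
        in_rate = out_rate = price
    return (prompt_tokens * in_rate + completion_tokens * out_rate) / 1000

# A heavy day of gpt-3.5-turbo use -- 100 requests of ~1,500 tokens each --
# still comes out to well under a dollar:
daily = 100 * estimate_cost("gpt-3.5-turbo", 1000, 500)
print(f"${daily:.2f}")  # → $0.30
```

At those rates, casual use of `gpt-3.5-turbo` through the API really does stay in the cents-per-day range; GPT-4, an order of magnitude pricier per token, is where the "$30 a day" comments further down come from.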
Anyway, thanks for this recommendation!

Got off the waitlist for GPT4 the other day and have been experimenting with it a bunch for coding tasks (generation, review, planning, bug fixing, etc.). I prefer 3.5 for the speed and cost and move to 4 for accuracy/detail when necessary.

Well, soon you'll be able to. Just select GPT-4 32k and voila. 30 dollars a day.

I'm on the waiting list for GPT4 API access. Giving this thing an entire project's codebase will be super interesting!

I have been using it for some time. Great, but it has a tendency to give short answers and not use the full context available, even if you set the max response size to 20k.

Comparing it to pasting code into GPT4 (the chat interface), how much better is it? I can't wait to see what this can do when an entire codebase can be held within the context window.

Yeah, it can help with some things, but if you ask it to, say, convert a Python app to Go, even with plenty of context it will give you hints without a full implementation. I suppose they favored concise answers in the fine-tuning process and reused the model to train the 32k.

Things that I have been doing: pasting a full table DDL with the entities and other classes involved and asking for some refactoring of a class. Also that plus some stacktraces. It definitely helps, but the price (at least on Azure) makes me think twice before using it. Also, the Azure playground is awful.

How do you use it? I've been trying to change my habits to use it more, but some of those grooves are pretty deep and it just doesn't come to mind.

It's been a watershed moment for me and honestly, even after just a few months I can't imagine life without it.

I'm not so sure, would you mind elaborating? I can definitely see the benefit for some professions such as writers etc, but outside of that I haven't seen a use case yet where I've felt it's provided enough benefit to justify the price.

Enjoying the app.
- I would love a toggle to make the left-hand chats just the title, no preview, to save space.
- Reordering of chats.
- When I am in a chat, the upper right should say the model but it just says 'GPT...'.
- Ability to copy just code blocks to the clipboard like in the traditional ChatGPT Plus interface.
- Ability to export a chat.

Replying in order:

- Noted! This shouldn't be too hard to implement.
- This is an important feature, I'm looking to implement it in the very near future.
- I had this happen a few times. I'll try to troubleshoot the issue. In the meantime, I found that resizing the window fixed it.
- Yup! I've added it to the to-do list.
- What kind of format would you find useful? I'm happy you're enjoying the app.

Don't really have any format in mind, txt would be okay.

When this app gets a newer version, how would I know? I've never purchased from Gumroad before.

When I publish an update, I can push an email to those who downloaded the app. I'll probably end up adding some opt-in auto-update feature in the app at some point.

it's $20/mo... that's pretty cheap no?

If you're from the US, it's roughly 0.03% of the average monthly income [0]; if you're from Afghanistan it's 60% of the average monthly income. What's cheap for you isn't cheap for others.

Expensive in Brazil, at least.

I was hoping that since this uses the API and is (I think?) its own implementation of the chat interface, it might be less confined by the guardrails in place on ChatGPT. But it appears to be similarly subject to moderation: "I'm sorry, but I cannot comply with your request to recall any of Malcolm Tucker's profanity-laden tirades. As an AI language model, I am programmed to maintain a respectful and appropriate tone at all times, and using offensive language would not be in line with my programming. However, I can assist you with any other questions or tasks you may have."
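For the chat-export request discussed above, a plain-text dump is straightforward when the conversation is stored as role/content pairs (the shape the OpenAI chat API uses). The function and layout below are a hypothetical sketch, not Machato's actual implementation:

```python
def export_chat_txt(title: str, messages: list[dict]) -> str:
    """Serialize a conversation to plain text.

    `messages` is a list of {"role": ..., "content": ...} dicts, the same
    shape the OpenAI chat API uses. This layout is a made-up example, not
    the app's real export format.
    """
    lines = [title, "=" * len(title), ""]
    for msg in messages:
        lines.append(f"[{msg['role'].upper()}]")
        lines.append(msg["content"])
        lines.append("")
    return "\n".join(lines)

chat = [
    {"role": "user", "content": "What does temperature do?"},
    {"role": "assistant", "content": "It controls sampling randomness."},
]
print(export_chat_txt("Temperature question", chat))
```

Since the underlying data is already structured, adding a JSON or Markdown exporter later would mostly be a matter of swapping the serializer.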
Thank you for creating this software, and for making it a one-time purchase rather than a subscription. A big breath of fresh air.

On a side note, I wish a stand-alone app like this was available for MidJourney, too. I really rather dislike using that powerful service either through Discord or a browser. /rant

I couldn't possibly fathom why they did this, until my husband asked me to share my paid Midjourney account (as we would with any other service)... Not sure if that was their intention, but... well played, Midjourney.

What sort of data (if any) do you, the developer, collect?

Through the app, none. The app uses your OpenAI API key and your Gumroad license key to verify access to both of those services.
License management is handled via Gumroad; their privacy policy might enlighten you further. My impression is they only require an email and some billing info (PayPal or credit card).

I can confirm this to be true (at present). I just installed it and ran Little Snitch to monitor and block network traffic in both directions. It didn't try to connect out to the internet at all until I started a new conversation thread, when it made two calls (listed below). The app has a valid signature and its running processes are not obfuscated in any way that I can see.

Fantastic — bought a license! Love how simple it is; exactly what I was hoping for. Do one for iOS, too! ;)

Very cool. I'd love it if it gave me more info about the differences between some of the available models. I've been using GPT4 exclusively from the web interface; I don't know why I might select GPT4 vs GPT4-0314. I believe I do know why I want to use 32k, but others might not. Basically, I'd want an explanation as to how these differ from what's available in the ChatGPT web app. The default temperature is 0.0 -- again, I'd want to know what the web app is using to know how I might want to use this setting.

edit: I guess an answer is my API key is only valid for 3.5 anyway. I'll try this again when I get API access to 4, but another suggestion would be to check capabilities when I paste the API key in and let me know which ones I have available. The UI sure needs a bit of work.

I'll try to add more information about the different models. As for the default temperature, 0.0 is actually a bug! I'll definitely fix that in the next release. The default intended value is 1.0. The API reference [1] describes what the temperature parameter does in more detail. While doing some testing, I also noticed that the app disables streamed responses by default. Make sure to check the appropriate checkbox in the settings to get token-per-token answers!
[1] https://platform.openai.com/docs/api-reference/chat/create

'-0314' is a snapshot date; the snapshot won't change, and will be retired in 3 mos.

Looks good. I bought MacGPT (https://www.macgpt.com), but sure, I'll buy this too and see how it compares. Hopefully it has quick access so I can summon it with a keyboard shortcut! :)

Will it run on OS 10.13? And: any plans for integrating audio? (Whisper speech-to-text and ElevenLabs or something as TTS. Should have a general toggle for when you want to work silently. And shouldn't read code or tables.)

It most likely won't work on OS 10.13. I've looked at this issue earlier today and I'm relying on libraries that require macOS ≥ 13.0. I might look into it if there is demand for it, though this will probably require leaving out some functionality for older OSes. As for audio integration, this is not on the to-do list, as I am looking to keep the client lightweight.

Ok cool. Don't worry about it. I think few people are as sluggish with software updates as I am. As for audio: I find ChatGPT gives the best answers when you give it a lot of context. Basically a dump of your entire thought process leading to the question. For that, I find typing to be a real bottleneck. And I do touch type. So I think, midterm, any ChatGPT interface without audio input is going to be unacceptable. I could be wrong, of course. Just a guess.

3 hr after post, 0 comments, 8 upvotes: "Sorry, the discount code you wish to use has expired." welp

Yeah, there was some delay in posting and the early bird discount expired. I've updated the post with another discount!

This is cool, I like native apps better. Are there any features with ChatGPT Plus on the web that are not supported in this (because it's the API)?

Yes. The new ChatGPT Plugins are not yet supported outside of the official web interface.
Other than that, most third-party clients actually allow more customization (custom system prompt, temperature adjustment, ...).

I would really love it if this had some kind of plugin system so I could also point it at other models (such as a locally run llama.cpp).

Just tried it out and really like the streaming support! I would love being able to edit prompts and "branch" conversations like on the web app.

This shouldn't be too hard to implement! Stay tuned for future updates.

How much will I be spending on OpenAI tokens / API while using this vs while using the chat webapp? What has been your experience?

I've used the API extensively over the last few weeks, and I have yet to go above $1. It's still technically more than the free ChatGPT web interface, but it doesn't suffer from load balancing constraints. My experience has been that using it this way is fairly inexpensive. You can track your expenses in real time and put spending limits on OpenAI's dashboard, so you don't get a bad surprise at the end of the month.

Thanks for your support. I have purchased your software. I think I will like using this. Thank you.

Just bought it, but it turns out I can't use it with Monterey :-(

I'll add some warning on the sales page, sorry about that. Ask for a refund (email me through Gumroad), and I'll approve it.

Thank you, appreciate the quick response, and the refund :-)

LaTeX rendering is a fantastic idea! Looks really nice.
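The customization knobs that keep coming up in this thread — custom system prompt, temperature, streaming — all map onto fields of the chat-completions request body that third-party clients send. A minimal sketch, assuming the standard OpenAI REST payload shape (`build_chat_request` is a hypothetical helper, not part of Machato or any other client mentioned here):

```python
def build_chat_request(messages, model="gpt-3.5-turbo",
                       system_prompt=None, temperature=1.0, stream=False):
    """Build the JSON body for a POST to /v1/chat/completions.

    Field names follow OpenAI's chat API reference; a desktop client
    just exposes these knobs in its settings UI.
    """
    full_messages = list(messages)
    if system_prompt is not None:
        # The system prompt is simply the first message, with role "system".
        full_messages.insert(0, {"role": "system", "content": system_prompt})
    return {
        "model": model,
        "messages": full_messages,
        "temperature": temperature,  # 1.0 is the API default
        "stream": stream,            # True => token-per-token SSE responses
    }

body = build_chat_request(
    [{"role": "user", "content": "Hello!"}],
    system_prompt="You are a concise assistant.",
    temperature=0.7,
    stream=True,
)
print(body["messages"][0]["role"])  # → system
```

This is also why the 0.0-temperature bug mentioned earlier matters: whatever value the client puts in that field overrides the API's default of 1.0, so an uninitialized setting silently makes every answer greedy and deterministic.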
The two calls were:

1. Gumroad license server
2. api.openai.com

There have not been any analytics/tracking calls (Sentry/NewRelic/Google, etc.) thus far, and I'd block them if there were.