Programs written in Golang have no secrets
gopacker.dev> This tool is only for legitimate programs only. It is strictly forbidden to use it for any malicious programs.
I'm sure the authors of malicious software will be certain to respect this clause
It’s a way of saying "I decline any responsibility if anyone uses this tool for malicious purposes".
What about legitimate programs that are created to help legitimate penetration testers, but are forbidden from being used by people with malicious intent?
Just for the record, OP is talking about the "obfuscation" tool and not about the "decompiler".
I would imagine that it is there in order to protect themselves if someone uses their tool for malicious purposes as they can point to that statement and say they were not complicit.
This reminds me of the once-often used phrase on CraigsList: "No scammers"
I mean, why even put that there?
It makes a bit of sense. Some scammers target the extremely naive, who don’t even know there are scams. Those scammers might skip such victims to save time. These days, when scamming is so automated I doubt it makes a difference.
Even if this program were not obfuscated by garble and were compiled the traditional way, it would still be flagged as malicious by many antivirus engines, regardless of whether it is signed or not.
IMHO these days it's far more about whether your binary is "known" than what it actually does. AV is a protection racket. I realised that long ago when "hello world" executables compiled with MSVC passed, but the same source code ran through GCC was deemed malicious and quickly deleted.
FYI, you have a small typo in the gopacker.dev URL in the docs here: https://gopacker.dev/docs/#how-to-use
> Download gopacker from https://gopcker.dev
Also, right below that, the title of step 3 says pakcing instead of packing
3. Pakcing the programs with gopacker
and in the "Packing golang programs" section, digital is spelled incorrectly
> After processing by our tool, 6 engines reported the program as malicious when without signing by *digigal* certificate.
I adore that obfuscated binaries are often treated as malicious by antiviruses. If you obfuscate, there is something to hide.
I don't see how a small company with actual know how can protect their trade secrets otherwise. Software patents and other legal means seem much worse to me than trying to obfuscate trade secrets. Going open source is not a viable solution for most companies either.
Why would the company rely on the scarcity of knowledge it deems "trade secrets" to gather revenue, instead of directly being productive? Expecting unbounded ongoing revenue from a constant amount of work done once to encode those "trade secrets" is extractive and doesn't incentivize future productivity.
Any product that requires many years of R&D has to rely on trade secrets or on software patents, since otherwise the company could not possibly compete with copycats who reverse engineer their know-how and therefore have zero R&D costs. There is often no other way; not every market is based on constant fake innovation and feature creep.
It’s still a valid strategy and it makes sense in the current economic system.
It's not really tenable because it just takes someone without a profit motive to reimplement it or something that can substitute for it and the business model breaks. There's countless examples of this.
If anyone can't read it in Chrome, Firefox makes it readable. In Chrome it hides the title and about 20% of the leftmost text behind the navigation column. I tried playing with the font sizes and it didn't help.
"need to purchase our product to get the license,the packed app is only valid for 7 days without license." And that's USD 299 per year.
Where do all the symbols come from if the binary has been stripped? Does go add its own proprietary symbol table sections to go binaries?
I don't see the author stripping any binaries.
(And features of Go assume the binary is not stripped, too. The symbol table is useful!)
> Here is an application written with golang, decompiled with IDA 8.3 Pro, we can see that IDA not only restores all the function names, but also restores the business code with good readability.
> And this program has used -ldflags "-s -w" to remove the debug information and symbol information when compiling, which shows that if no processing is done for the program written in golang, there will be no secrets at all.
Though perhaps those flags were just not effective. Maybe Go places its own stuff after the user-specified flags, or maybe they're only used in cgo mode?
Edit: https://words.filippo.io/shrink-your-go-binaries-with-this-o...
It retains the symbols needed to format stack traces, so most symbols remain. So go link's "-s" is rather different from how normal linkers interpret "-s" (don't emit symbols or debug information, functionally equivalent to strip(1)).
Ah, I missed that searching for "strip".
So this lets you circumvent AV detection. How long until they add detections for this too?
Since this article leans on screenshots from VirusTotal detecting/not detecting something as malware, it is worth pointing out that just because the engine vendor X contributes to VirusTotal detected something as malicious, that doesn't mean their commercial products will do the same thing[0].
1) The vendors all want to participate in VirusTotal as marketing, and to get access to the raw samples (it is a reliable high-volume source of commodity malware.)
2) There is pressure to over-detect[1], so your competitors don't send out screenshots of your engine (which they imply to be your product) not detecting some novel malware sample, which of course they only do if their engine (which they also imply to be their product) detects it correctly.
3) Almost all vendors have additional detection mechanisms that don't make it into their VirusTotal engines for various reasons.
So, anyway, if you want to get a sense of if, in general, your binary is likely to trip “AV,” then, by all means, submit it to VT. If, however, you want to know if a specific AV detects it, submitting it (or worse, some other binary processed with the same packer) to VT is dumb.
0 - https://docs.virustotal.com/docs/antivirus-differs
1 - interestingly, some researchers found lower detection rates on VT, which they attributed to the vendors not using their cloud-based analysis modules. https://www.researchgate.net/publication/364073462_A_Compara...
Please, let me be that guy...
If you chose to write your program in Go, it is likely not worth copying. ;-)
I thought this was gonna be about how it's hard to keep secrets out of memory in a GC language. I had a friend working with Go who was surprised that a heap dump contained a secret key they did not want exposed. So you have to set an env var (GOGC) so it gets cleared faster.
Reducing the time a secret lingers in memory is at best a mitigation. Linux has MADV_DONTDUMP for the madvise system call to exclude certain pages from core dumps. I'm not sure if something like this could be available in golang, preferably wrapped in some platform-agnostic way.
Also mlock to prevent the memory page being written to disk and make sure to properly overwrite the secret data once you no longer need it. Make sure this doesn't get optimized away.
libsodium has functions for all of that. Rust has the "secrets" crate that is a wrapper around these.
I don't know much Go, but a quick search looks like it has libraries that take care of these things as well - unsurprisingly.
You could bash together system calls to get some memory allocated that way that you could access, but any language that makes values some combination of easy to copy and not having first-class access to the memory layout is going to be a constant uphill battle to keep the secrets where they "belong".
And almost everything nowadays has easy-copy values, because almost everything copies all function parameters. It will be difficult, twitchy, and subject to whatever local compiler/interpreter optimizations as to whether or not any sort of code does or does not safely keep all values only in the "correct" space.
I wouldn't trust Go qua Go to keep my secrets only where I "put" them, and I wouldn't trust hardly any other modern language either. They're all based on copying arguments around. Rust is closest and I'm not sure I'd trust even that without a lot of checking, because while the language layer may have best-of-class controls over sharing, that's not to say the optimizer won't create copies of things under the hood. Those sharing controls are not, as I understand it, hardware-level promises as to how memory will be treated, just language-level promises. You almost have to reduce to a minimal kernel and program it in assembler, if you want to be sure the secrets can't flow, and that may be easier said than done depending on how complicated that kernel is. (e.g., simply reading something from a file and keeping it confined is easy, but if I have to do crypto with a secret that's a very large chunk of code to worry about.)
It isn't even just the software, even the hardware stack is just not designed for the CPU to not make copies. The hardware is designed to present an isolated view of the world to the software it is running on but it has numerous and large abstractions between the view of the world the software running normally sees and the actual state of the hardware. What you will see when those abstractions are penetrated (e.g., a straight-up RAM dump or disk dump) can be difficult to predict. A RAM page swapped to an SSD can physically reside on that SSD indefinitely because the SSD is remapping sectors continuously, you could get unlucky and that's the last thing ever written to that physical bit of the disk if the controller marks it bad. And that's just an example, not the complete list; I wouldn't count on madvise to protect me from all copies without a lot more research. Everything from the highest software layer to the lowest hardware layer is fighting you if you try to exercise this much control over where your data goes.
Oh yeah. They moved on from Go and are now fiddling w/ Rust. They kinda laughed at me when I said: just use C.
C has one of the worst interfaces available for that kind of thing.
Not only does it follow C's "you just have to remember it every time" convention, but you also have to remember to check that your compiler isn't optimizing the clearing away.
I don't see a difference between GC languages and others, so your secret is in a rust heap then what?
Every language has its memory exposed; GC has nothing to do with it.
> I don't see a difference between GC languages and others, so your secret is in a rust heap then what?
The problem with a copying GC is that, even if you clear the memory which had for instance a secret key, it might have been copied from elsewhere as part of a heap compaction, and the old copies of that memory will not be cleared (until accidentally overwritten when the GC reuses that memory for something else). With manual memory management (or a non-copying GC), you can always manually erase the memory before releasing it (but even then, you have to make sure the compiler doesn't optimize out your clearing; for instance, you should use explicit_bzero() in C).
Rust has a variant of this issue: even though it doesn't have a GC, values tend to be moved in memory when transferring their ownership. This can be avoided by keeping them in the heap (within a Box), since it's the pointer to the heap (the Box struct) which tends to be moved, but you have to take care to never move the value out of the Box (or similar like Rc or Arc) before clearing it. The Pin struct might help by making it harder to move the value out of the Box (as long as you understand how to use it correctly; I've always found the Pin struct somewhat confusing).
Go has a clear() function now, but I haven't checked to see how it actually behaves in terms of secret abatement in memory. https://tip.golang.org/ref/spec#Clear
That's not really what clear is for.
There is memguard [1] but I don't know how secure it is.