Two brothers from Syria.
Building ethical AI born from lived experience and a sense of responsibility.

2011
Born in Syria, they migrated to Germany shortly before the war began in 2011. While deeply grateful for the opportunities they found in their new home, they carried a constant awareness of their privilege - people in their homeland were facing devastating conflict while they built futures abroad.
2022
They established successful careers in Germany's tech industry, working for a company they genuinely cherished. Yet, they grew increasingly uncomfortable with the reality that many of their clients were tied to harmful policies, particularly those supporting the Israeli government. As they witnessed the genocide in Palestine unfold, this created a painful contradiction: they were working against their own people.
December 2024
The breaking point came when Syria's dictatorship finally fell at the end of 2024 - a moment they had never witnessed in their lifetimes.
As the country began its rebuilding process and called on Syrians abroad to return, they felt a duty to act. So they made the decision to quit their jobs.
Only then did they turn their attention to the tech landscape, asking themselves: 'What can we actually do with our skills?' What they discovered was deeply troubling. The ecosystem was 'very dark' - one system serving another, offering free tools that claimed to work for the people while actually working against them.
They looked for alternatives and found some options: Signal instead of WhatsApp, Brave Browser instead of Google's Chrome. But for ChatGPT? There was no ethical alternative - and that was terrifying, given its massive data consumption and the fact that AI is the system everyone is using right now, the foundation everything else is being built upon.
Well, there was an alternative for technically savvy people: self-hosted open-source LLMs. But that wasn't an option for non-technical users like their auntie, who wouldn't know how to set one up.
Building Thaura
So they built a basic version that could handle conversations and gave it to their auntie.
“Sometimes I don't want to type... I'd rather just talk.”
That prompted them to add voice mode. Then she asked about images, so they added visual capabilities too.
With each request, they kept refining and perfecting Thaura, shaping it into the ethical AI people actually needed and wanted to use.
Finding Their Tribe
For a while, it was just them and their circle - building something they believed in, without wider validation. They brought Thaura to near-completion before they even had an audience, working in isolation with only feedback from family and friends.
Then one day, they discovered Tech for Palestine - an incubator for tech startups working toward a free Palestine. It was like finding their tribe.
Through this community, they realized they weren't alone - there were others who shared their values and vision.
Now, through Tech for Palestine, they're establishing themselves as the ethical AI alternative the world needs.
Thaura is Alive Today
Make the move from Big Tech to technology that actually serves the people instead of working against them.