@PowerHarryG4 Hi, at the moment I have everything running but not working quite as it should; maybe you can get further than I did. Anyway, I did change the formula for building Squid, but I don't think that makes any difference.

Generate the certificates the same way @Wowfunhappy's installer does (thank you!). I put them in a separate folder:

cd
mkdir SquidConf
cd SquidConf
openssl req -x509 -newkey rsa:4096 -subj '/CN=Squid' -nodes -days 999999 -keyout squid-key.pem -out squid.pem
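
If you want to double-check what was generated before pointing Squid at it, openssl can print the certificate's subject and validity dates and sanity-check the key (paths assume the SquidConf folder above):

openssl x509 -in ~/SquidConf/squid.pem -noout -subject -dates
openssl rsa -in ~/SquidConf/squid-key.pem -check -noout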

Edit squid.conf at /etc/squid/squid.conf; mine looks like this:

http_port 3128 ssl-bump generate-host-certificates=on cert=/home/pi/SquidConf/squid.pem key=/home/pi/SquidConf/squid-key.pem

tls_outgoing_options cafile=/etc/ssl/certs/ca-certificates.crt
sslcrtd_program /usr/lib/squid/security_file_certgen
#sslcrtd_children 10 startup=5 idle=1

acl localnet src 0.0.0.1-0.255.255.255
acl localnet src 10.0.0.0/8
acl localnet src 100.64.0.0/10
acl localnet src 169.254.0.0/16
acl localnet src 172.16.0.0/12
acl localnet src 192.168.0.0/16
acl localnet src fc00::/7
acl localnet src fe80::/10

acl excluded_domains ssl::server_name .pypi.org .pythonhosted.org
acl apple_domains ssl::server_name_regex ess\.apple\.com$ ^sw.*\.apple\.com$
acl excluded any-of excluded_domains apple_domains localnet
ssl_bump splice excluded
ssl_bump bump all

acl fetched_certificate transaction_initiator certificate-fetching
cache allow fetched_certificate
http_access allow fetched_certificate
sslproxy_cert_error deny all

http_access allow localhost
http_access deny to_localhost
http_access allow localnet
http_access deny all

cache_log /dev/null
access_log none
logfile_rotate 0
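
This isn't part of the original steps, just how I'd check it: before restarting you can have Squid validate the file, and once it's running you can test the bump from the Pi itself (if curl is installed) by trusting the certificate generated above. If the MITM is working, curl should print headers without complaining about the certificate:

sudo squid -k parse -f /etc/squid/squid.conf
curl --proxy http://127.0.0.1:3128 --cacert ~/SquidConf/squid.pem -I https://www.wikipedia.org/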

For some reason the script to start the service at /etc/init.d/squid was missing, so I added it. If you have the same problem, here is its content. I did this part somewhat differently, but I think it should work:

sudo nano /etc/init.d/squid

#! /bin/sh
#
# squid Startup script for the SQUID HTTP proxy-cache.
#
# Version: @(#)squid.rc 1.0 07-Jul-2006 luigi@debian.org
#
# pidfile: /var/run/squid.pid
#
### BEGIN INIT INFO
# Provides: squid
# Required-Start: $network $remote_fs $syslog
# Required-Stop: $network $remote_fs $syslog
# Should-Start: $named
# Should-Stop: $named
# Default-Start: 2 3 4 5
# Default-Stop: 0 1 6
# Short-Description: Squid HTTP Proxy version 4.x
### END INIT INFO

NAME=squid
DESC="Squid HTTP Proxy"
DAEMON=/usr/sbin/squid
PIDFILE=/var/run/$NAME.pid
CONFIG=/etc/squid/squid.conf
SQUID_ARGS="-YC -f $CONFIG"

[ ! -f /etc/default/squid ] || . /etc/default/squid

. /lib/lsb/init-functions

PATH=/bin:/usr/bin:/sbin:/usr/sbin

[ -x $DAEMON ] || exit 0

ulimit -n 65535

find_cache_dir () {
    w=" 	" # space tab
    res=`$DAEMON -k parse -f $CONFIG 2>&1 |
        grep "Processing:" |
        sed s/.*Processing:\ // |
        sed -ne '
            s/^['"$w"']*'$1'['"$w"']\+[^'"$w"']\+['"$w"']\+\([^'"$w"']\+\).*$/\1/p;
            t end;
            d;
            :end q'`
    [ -n "$res" ] || res=$2
    echo "$res"
}

grepconf () {
    w=" 	" # space tab
    res=`$DAEMON -k parse -f $CONFIG 2>&1 |
        grep "Processing:" |
        sed s/.*Processing:\ // |
        sed -ne '
            s/^['"$w"']*'$1'['"$w"']\+\([^'"$w"']\+\).*$/\1/p;
            t end;
            d;
            :end q'`
    [ -n "$res" ] || res=$2
    echo "$res"
}

create_run_dir () {
    run_dir=/var/run/squid
    usr=`grepconf cache_effective_user proxy`
    grp=`grepconf cache_effective_group proxy`

    if [ "$(dpkg-statoverride --list $run_dir)" = "" ] &&
       [ ! -e $run_dir ] ; then
        mkdir -p $run_dir
        chown $usr:$grp $run_dir
        [ -x /sbin/restorecon ] && restorecon $run_dir
    fi
}

start () {
    cache_dir=`find_cache_dir cache_dir`
    cache_type=`grepconf cache_dir`
    run_dir=/var/run/squid

    #
    # Create run dir (needed for several workers on SMP)
    #
    create_run_dir

    #
    # Create spool dirs if they don't exist.
    #
    if test -d "$cache_dir" -a ! -d "$cache_dir/00"
    then
        log_warning_msg "Creating $DESC cache structure"
        $DAEMON -z -f $CONFIG
        [ -x /sbin/restorecon ] && restorecon -R $cache_dir
    fi

    umask 027
    ulimit -n 65535
    cd $run_dir
    start-stop-daemon --quiet --start \
        --pidfile $PIDFILE \
        --exec $DAEMON -- $SQUID_ARGS < /dev/null
    return $?
}

stop () {
    PID=`cat $PIDFILE 2>/dev/null`
    start-stop-daemon --stop --quiet --pidfile $PIDFILE --exec $DAEMON
    #
    # Now we have to wait until squid has _really_ stopped.
    #
    sleep 2
    if test -n "$PID" && kill -0 $PID 2>/dev/null
    then
        log_action_begin_msg " Waiting"
        cnt=0
        while kill -0 $PID 2>/dev/null
        do
            cnt=`expr $cnt + 1`
            if [ $cnt -gt 24 ]
            then
                log_action_end_msg 1
                return 1
            fi
            sleep 5
            log_action_cont_msg ""
        done
        log_action_end_msg 0
        return 0
    else
        return 0
    fi
}

cfg_pidfile=`grepconf pid_filename`
if test "${cfg_pidfile:-none}" != "none" -a "$cfg_pidfile" != "$PIDFILE"
then
    log_warning_msg "squid.conf pid_filename overrides init script"
    PIDFILE="$cfg_pidfile"
fi

case "$1" in
    start)
        res=`$DAEMON -k parse -f $CONFIG 2>&1 | grep -o "FATAL: .*"`
        if test -n "$res";
        then
            log_failure_msg "$res"
            exit 3
        else
            log_daemon_msg "Starting $DESC" "$NAME"
            if start ; then
                log_end_msg $?
            else
                log_end_msg $?
            fi
        fi
        ;;
    stop)
        log_daemon_msg "Stopping $DESC" "$NAME"
        if stop ; then
            log_end_msg $?
        else
            log_end_msg $?
        fi
        ;;
    reload|force-reload)
        res=`$DAEMON -k parse -f $CONFIG 2>&1 | grep -o "FATAL: .*"`
        if test -n "$res";
        then
            log_failure_msg "$res"
            exit 3
        else
            log_action_msg "Reloading $DESC configuration files"
            start-stop-daemon --stop --signal 1 \
                --pidfile $PIDFILE --quiet --exec $DAEMON
            log_action_end_msg 0
        fi
        ;;
    restart)
        res=`$DAEMON -k parse -f $CONFIG 2>&1 | grep -o "FATAL: .*"`
        if test -n "$res";
        then
            log_failure_msg "$res"
            exit 3
        else
            log_daemon_msg "Restarting $DESC" "$NAME"
            stop
            if start ; then
                log_end_msg $?
            else
                log_end_msg $?
            fi
        fi
        ;;
    status)
        status_of_proc -p $PIDFILE $DAEMON $NAME && exit 0 || exit 3
        ;;
    *)
        echo "Usage: /etc/init.d/$NAME {start|stop|reload|force-reload|restart|status}"
        exit 3
        ;;
esac

exit 0

I believe you also have to make the script executable:

sudo chmod 755 /etc/init.d/squid
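
I'm not sure this step is strictly needed, but on Debian/Raspberry Pi OS systemd only picks up an init script once it has been registered, so if systemctl complains that squid.service is not found, registering the script might help (a guess on my part):

sudo update-rc.d squid defaults
sudo systemctl daemon-reload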

Check whether Squid starts and check its status (or just reboot the RPi). As I said, it should work, but I'm not sure; I did this a different, more time-consuming way.

sudo systemctl start squid.service
sudo systemctl status squid.service

For some reason Squid was reporting that pinger kept closing, and I found that changing its permissions fixes it:

sudo chmod 4755 /lib/squid/pinger
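
To confirm the setuid bit actually stuck (pinger needs it to open raw ICMP sockets), the owner bits in the listing should show rws:

ls -l /lib/squid/pinger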

Copy squid.pem to your Mac and add it to Keychain Access, then change the proxy configuration in Network preferences to point at the RPi's IP.
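
If you prefer to do that from Terminal instead of clicking through Keychain Access, something along these lines should do the same thing (192.168.1.x stands in for the RPi's address, and I haven't checked that add-trusted-cert is present on every OS X version people here are running):

scp pi@192.168.1.x:SquidConf/squid.pem ~/Desktop/
sudo security add-trusted-cert -d -r trustRoot -k /Library/Keychains/System.keychain ~/Desktop/squid.pem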

At this point I have Squid running and the Mac using the proxy, but I can't reach Wikipedia.

As far as I can tell the Mac is connected to the proxy: I can reach roughly the same pages as before, and I don't see the "the proxy can't be reached" message in LWK. Maybe it's something in the configuration file, or my IP assignments (my devices get addresses from xx.xx.xx.43 to .60), or the fact that my subnet mask here is 255.255.255.0, which differs from what's specified in squid.conf. I could be wrong; I'm learning and it's a work in progress.

One odd thing: Pi-hole seems to be affected by the connection to Squid, as if it were bypassed when the proxy is in use (I don't know whether that's expected).

Thank you for writing all that. I've been following along, and for some reason when I do 'sudo systemctl start squid.service' it says 'Failed to start squid.service: Unit squid.service not found.' I must have missed a step at some point, but I'm not sure which one.

Edit: I managed to get Squid to run with 'sudo /etc/init.d/squid start'. I have no idea if this is right, but it's working as a proxy now. Pages such as Wikipedia I haven't gotten to work yet.
 
@PowerHarryG4 I see. As I said, I do things somewhat differently; try installing Squid from apt and then removing it without purging:

sudo apt update
sudo apt install squid
sudo apt remove squid

Then install the Squid you built again; that fixed the issue for me, but I believe putting the init script in place and getting the permissions right will fix it more quickly.

After that, Squid should start after a reboot, or with:

sudo systemctl start squid
 
It's now saying 'Failed to start squid.service: Unit squid.service is masked.' Do you know what that means?
 
@PowerHarryG4 Did you stop the service first?
sudo systemctl stop squid.service or sudo /etc/init.d/squid stop

I think you should stop it if it's running, delete the squid file from /etc/init.d (sudo rm /etc/init.d/squid), install Squid from apt, remove it without purging the config files, install the version you compiled and, why not, restart the Pi. (I'm guessing that's what you should do; like you, I'm a newbie at this kind of thing.) 🤞
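
Also, if systemd says the unit is masked, I think (I haven't hit this myself, so treat it as a guess) unmasking it and reloading systemd before starting again may help:

sudo systemctl unmask squid.service
sudo systemctl daemon-reload
sudo systemctl start squid.service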
 
You might also want to start Squid in the foreground with squid -N -d 1 (rather than through the init script) so it prints messages to the console. (If it's already running, stop it first.)
 
Carl is one of those C projects which are extremely easy to compile.
However, it needs to be hooked up to inetd to work as a proxy, which is not something I've ever worked with before...
Been trying to get this to work. Built carl, also built micro_inetd (http://acme.com/software/micro_inetd/), and it looks like they both run, but I'm not sure how to connect LWK to it. I thought changing network settings to localhost 3128 would do it, but the same https sites fail as they did with no proxy. I also added carl to /etc/services for port 3128. Running netstat -a shows the proxy is indeed running. Any ideas? I feel I'm on the right path, but maybe I missed something.

[Attachment: carl-test.png]
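
Not sure if this helps, but one way to tell whether the listener on 3128 is actually handing the connection to carl (instead of just accepting it and hanging) is to type a proxy-style request at it by hand; if carl is wired up, HTML from Wikipedia should come back. This assumes nc is installed and the listener is on port 3128 as above:

printf 'GET https://www.wikipedia.org/ HTTP/1.0\r\n\r\n' | nc localhost 3128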
 
Hi, after removing localnet from the line acl excluded any-of excluded_domains apple_domains localnet in squid.conf, LWK can reach Wikipedia (and other sites) through the RPi Squid proxy.

The BIG downside is that TenFiveTube v5, PPCMC 7.2.3 and my TFF boxes no longer work. I suppose that's expected but, hey, it's working, and if I want to use the other apps I just turn the proxy off in settings. I'll take it.

On the RPi side everything looks OK and is working flawlessly (Pi-hole and Squid). The weird behavior that was letting ads through to the browser is still present, but to a lesser extent (probably I have to update my adlists, because Pi-hole does seem to be filtering).

Thank you! This is huge for me @Wowfunhappy @wicknix :)

[Attachment: Picture 2.png]
 

[Attachments: TenFiveTube.png, TFFBox.png]
Glad you got it running. Not sure about TenFiveTube, as it works on my end. For your fox boxes and/or anything Mozilla-based, go into TFF's preferences and change the network settings to not use a proxy. By default they use the system proxy. They should start working again after that.
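
If anyone wants to set that without clicking through the UI, the same switch lives in about:config as network.proxy.type (0 = no proxy, 5 = use system settings), so a single line in user.js in the TFF profile should also work. Standard Mozilla pref names, so treat it as a sketch:

user_pref("network.proxy.type", 0);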

Cheers
 
^ Just echoing Wicknix about proxy settings in Firefox-based browsers. The other option is to tell Firefox to trust the Squid certificate (the reason this comes up at all is that Firefox ignores Keychain Access). But since TenFourFox can handle modern TLS, there's really no reason to use the proxy with it at all.

Hi, after removing localnet from the line acl excluded any-of excluded_domains apple_domains localnet in squid.conf, LWK can reach Wikipedia (and other sites) through the RPi Squid proxy.
Huh, that's interesting! I added localnet so the proxy wouldn't be used when accessing other computers within your own network. I needed this to use docker-machine, for example. That won't be an issue on PPC, but I still wonder what's going on there.
 
Maybe I touched something, because I already have "no proxy" set in TFF's preferences, as Wowfunhappy's readme says. I'll try again with a fresh Mac OS X install.
 
I already have "no proxy" set in TFF's preferences, as Wowfunhappy's readme says.
Well that's odd. By definition, if TFF isn't using a proxy, then enabling a proxy shouldn't change how it behaves.

Bug in TFF, perhaps?
 
I also want to add: maybe we don't need LWK for Wikipedia and some smaller sites that won't load. SeaMonkey PPC 2.6 runs Wikipedia and everything else, so there is another maintained PPC browser from 2015 still working. TLS 1.2 is supported.
 
@repairedCheese or anyone else trying this on Leopard....
I *think* I made some headway. I've been using the proxy for about an hour with no issues. I made some changes to squid.conf and so far it seems rock stable.
Attached below is the new config. Make a backup of the original, then uncompress this one and copy it over to /Library/Squid. Then start/restart Squid and let me know the results.
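
If the squid binary from that install is on your PATH, and assuming the config lives at /Library/Squid/squid.conf (adjust the path if your layout differs), you can check the new file and have a running Squid pick it up without a full stop/start:

sudo squid -k parse -f /Library/Squid/squid.conf
sudo squid -k reconfigure -f /Library/Squid/squid.conf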

Thanks.

Edit: NVM. Was fine, rebooted, now it's wigging out again. :-/
 

[Attachment: squid.conf.zip]
Hi, after removing localnet from the line acl excluded any-of excluded_domains apple_domains localnet in squid.conf, LWK can reach Wikipedia (and other sites) through the RPi Squid proxy.
While this is great for continued use of LWK, there is another browser that was recently updated by a PowerPC diehard: look at SeaMonkey for PPC, last updated by its PowerPC developer. It simply just works, and Wikipedia works great in it too. I'd also like to add that via oldweb.net you can access a backdoor to Wikipedia from the past, and from there it's possible to search for anything you want, even up to 2021. For example, in OS 9 and also Leo, I typed http://oldweb.net, chose Wikipedia from the past, and was able to enter it in the following browsers:

Classilla, Netscape 4.7 and IE 5.1.7. I was able to view Wikipedia in all its glory, and when I searched for Big Sur, it found it. I then tried this in LWK and it works there too.
 
@repairedCheese or anyone else trying this on Leopard....
Only works under Snow Leopard, right?
 
I also added carl to /etc/services for port 3128. Running netstat -a shows the proxy is indeed running. Any ideas?
No ideas per se, but...

I thought changing network settings to localhost 3128 would do it, but the same https sites fail as they did with no proxy.

🤔 Well that's interesting. Because if you added an HTTPS proxy in System Preferences, but the proxy wasn't running at all, no https websites would connect, even ones that normally worked, right?

So you've successfully set up a proxy server, it just isn't mitm'ing https traffic like it's supposed to. Perhaps micro_inetd is running, but it isn't actually talking to carl?
 
Yeah, I'm stumped. micro_inetd works fine when I replace carl with micro-proxy, but then it only seems to work with regular http sites. So I do believe carl is indeed started and running (as shown in those images). Maybe LWK is too new for carl? My next step will be trying with an ancient Mozilla-based browser, as I don't know how to add this to WebKit.
The reason the browser prefers to use CONNECT is so the connection between the server and the browser is encrypted end-to-end (all the proxy is doing, in this case, is shoveling data back and forth). However, this means the browser is doing the encryption, which is not what we want. 9.3.4b adds a new preference called network.http.proxy.use-http-proxy-for-https which says that the browser should make an unencrypted request for an encrypted resource and defer the encryption to the proxy. Find this preference in about:config and set it to true.

Now view any https:// URL. The request will be forwarded to Crypto Ancienne, which will do the encryption for you.
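
For reference, the difference on the wire is just the request line the proxy receives; the second form is what carl / Crypto Ancienne wants to see:

CONNECT www.wikipedia.org:443 HTTP/1.1   (default: the browser tunnels and does the TLS itself)
GET https://www.wikipedia.org/ HTTP/1.1   (with the pref set: the proxy does the TLS)

And if you'd rather set the pref from a file than about:config, a user.js line should work too (standard Mozilla syntax; I haven't tested it in 9.3.4b specifically):

user_pref("network.http.proxy.use-http-proxy-for-https", true);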
 
Wicknix, I know this may be a little off topic, but is it possible to retrofit older browsers with an updated rendering engine? For example, since I have the source code for Netscape, could a new rendering engine with updated security be built into it?
 
Not unless you are God. ;-)
Anything is possible. However what you are asking is next to impossible when it comes to old OS's. Nobody here has the knowledge, the money, or the thousands of man hours it would take just to make a few hundred retro Mac users happy.
 
So, realistically speaking, it's possible; I guess the question is whether it's worth the time and effort at this stage, though I very much want to do it.
 
OK, here is another idea: say I wanted to go to Starbucks with my PowerBook G4 Titanium and browse the internet. Is there a way to set up a proxy on my iPhone 7 so it can let the old browser (LWK) access the sites it can't reach on its own?
 