bstsb 3 hours ago

really like how this blog is written. a lot of writeups like this recently have been generated by an LLM, and it's quite distracting to read - this was a pleasant surprise. it strikes a good balance between technical and laid-back

(yes i know the cover image is AI-generated, that's incidental to the content)

  • jraph an hour ago

    I've been blocking bigger media files by default with uBlock Origin to avoid needless resource usage. Cover images are typically blocked, and they're usually useless anyway.

    It's too bad people spend energy generating them now.

201984 4 hours ago

Are techniques like using Frida and mitmproxy on Android apps still going to be possible after the signing requirement goes into effect next year?

  • pimterry 33 minutes ago

    It's not really going to be directly affected by that change, I would expect. Most reverse engineering is done on rooted devices & emulators anyway, which are already outside the bounds of those kinds of Google restrictions.

    For the (less common) cases where you want to use a non-rooted device (e.g. using Frida by injecting it into the APK via gadget) it gets trickier, but I think in practice there will still be a way for developers to build & install their own APKs with developer mode enabled. This will be tightened, but removing that restriction would effectively make Android development impossible, so it seems very unlikely - I think they will only block sideloading on non-developer devices, or allow you to add your own developer cert for development, or similar (all of which would probably be fine for development & reverse engineering, while still being a massive pain for actual distribution of apps).

    The larger issue is device attestation, which _could_ make all rooted/non-certified devices progressively less practical, as more apps aggressively check that they're running on unmodified, certified devices. Right now that's largely limited to big financial apps, and has some downsides (you get a bunch of complaints from all 3 GrapheneOS users, and it requires a bunch of corresponding server work to be reliable) but it could become more widespread.

  • bri3d 4 hours ago

    Overall: yes, but it will get much harder for apps which need attestation, which is sort of the point, for better or for worse. As far as I know you'll still be able to OEM unlock and root phones where it's always been allowed, like Pixels, but then they'll be marked as unlocked so they'll fail Google attestation. You should also be able to still take an app, unpack it, inject Frida, and sideload it using your _own_ developer account (kind of like you can do on iOS today), but it will also fail attestation and is vulnerable to anti-tampering / anti-debugging code at the application level.

    • josteink 2 hours ago

      So for people with any practical needs whatsoever (like banking): No.

      At this point Android isn’t meaningfully an open-source platform anymore, and it hasn’t been for years.

      On the somewhat refreshing side, they are no longer being dishonest about it.

      • bri3d 2 hours ago

        I don't think any vendor should be solving for "I want to do app RE and banking on the same device at the same time;" that seems rather foolish.

        These are sort of orthogonal rants. People view this as some kind of corporate power struggle, but in this context GrapheneOS, for example, also doesn't let you do this kind of thing, because it focuses on preserving user security and privacy rather than on using your device as a reverse-engineering tool.

        There is certainly a strong argument that limiting third-party app store access and user installation of low-privilege applications is an anticompetitive move, but by and large, that's a different argument from "I want to install Frida on the phone I do banking on," which just isn't a good idea.

        The existence of device attestation is certainly hostile to reverse engineering, and that's by design. But from an "I own my hardware and should use it" perspective, Google continue to allow OEM unlock on Play Store purchased Pixel phones, and the developer console will allow self-signing arbitrary APKs for development on an enrolled device, so not so much has changed with next year's Android changes.

        • franga2000 32 minutes ago

          > But from an "I own my hardware and should use it" perspective, Google continue to allow OEM unlock on Play Store purchased Pixel phones, and the developer console will allow self-signing arbitrary APKs for development on an enrolled device [...]

          But that's not really using it, is it? If the process of getting access to do whatever I want on my smartphone makes it cease to be a viable smartphone, can you really count that as being able to use it?

          It's like if having your car fixed by a third party mechanic made it not street legal. It is still a car and it does still drive, but are you really still able to meaningfully use it?

          And before anyone jumps on my metaphor with examples of where that's actually the case with cars, think about which cases and why. There are modifications that are illegal because they endanger others or the environment, but everything else is fair game.

        • 3abiton an hour ago

          What I don't get is: if I'm using my bank's website on Linux (with full root access), it's nearly the same as having the app on Android. The argument of "we lock it down to protect you" makes zero sense to me.

          • bri3d 36 minutes ago

            * Your bank (and Google) want to deal with as little fraud as possible.

            * Market forces demand they provide both a website and an Android app.

            * If both platforms are equally full of fraud, have the same features, and see similar use, then making even one of them fraud-proof cuts out half the fraud.

            * But it isn't an even split in reality: something more like 80% of their use and 90% of their fraud comes from mobile devices, so cutting off that route immediately reduces their fraud load by the lion's share.

            Ergo, locking down the app is still in everyone's best interest, before we even get into the mobile app having features the desktop one does not (P2P payments, check deposit, etc.)

            And this isn't just a weird theory / ivory tower problem: Device Takeover banking fraud on Android is _rampant_ (see Gigabud/GoldDigger).

            • Wowfunhappy 33 minutes ago

              Why does most fraud come from locked down mobile devices and not open Windows/Linux PCs?

              If it's true that 90% of fraud comes from mobile despite all of the restrictions, what that tells me is that locking down devices doesn't actually prevent fraud.

              ---

              > before we even get into the mobile app having features the desktop one does not (P2P payments, check deposit, etc.)

              I think it would be reasonable to disable those specific features on mobile while leaving the rest of the app accessible.

              Actually, back when jailbreaking iOS was still feasible, I recall the Chase app doing exactly that. The app worked fine, but it wouldn't let me deposit checks; I had to go to a branch for that. A bit annoying, but I can mostly understand that one.

          • machinate 43 minutes ago

            They usually don't let you deposit checks via web app.

        • KetoManx64 an hour ago

          GrapheneOS strongly recommends that you don't do it, but it won't stop you if you want to. You can root and leave your bootloader unlocked, or create a custom user-signed image with root support included. There are plenty of user-written guides out there on how to do so.

          • bri3d an hour ago

            > You can root and leave your bootloader unlocked

            That's Google, not GrapheneOS.

      • Wowfunhappy an hour ago

        I'm stuck on iOS for various reasons, but if I was on Android I could do without mobile banking in exchange for having root privileges. I don't entirely understand why this is such a big deal.

        If e.g. Slack required attestation that would be a different story. I need that for work.

      • miki123211 an hour ago

        Open source has nothing to do with hackability.

        Firmware which requires updates to be signed with a manufacturer key can still be open source. As long as its code is available publicly, under a license which lets the user create derivative works, it meets the definition. You can still make a version of it that doesn't contain that check, you just can't install that version on the device you bought from the original firmware developer. Some FIDO keys (and I think Bitcoin wallets) do this.

  • mschuster91 2 hours ago

    They're already barely possible as it is.

    For frida to work you need to root the device, which is impossible on ever more models, and there's an endless supply of very good rooting detection SDKs on the market, not to mention Play Integrity.

    • pimterry 24 minutes ago

      > For frida to work you need to root the device, which is impossible on ever more models

      There are plenty of physical devices where it is possible, and Google publish official emulator images with root access for every Android version released to date. This part is still OK.

      > there's an endless supply of very good rooting detection SDKs on the market, not to mention Play Integrity

      Most of the root detection is beatable with Frida and similar tooling.
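
      (As a rough illustration of what beating a root check looks like, here's a minimal sketch using Frida's Python bindings. The package and class names are hypothetical - you'd find the real ones by decompiling the APK.)

          import frida

          # Hypothetical class/method names; real targets come from decompiling the app.
          HOOK_JS = """
          Java.perform(function () {
            var Checker = Java.use('com.example.app.security.RootChecker');
            Checker.isDeviceRooted.implementation = function () {
              return false;  // always report "not rooted"
            };
          });
          """

          device = frida.get_usb_device()
          pid = device.spawn(["com.example.app"])  # spawn so the hook lands before startup checks
          session = device.attach(pid)
          script = session.create_script(HOOK_JS)
          script.load()
          device.resume(pid)
          input("Hook installed; press Enter to detach\n")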

      Play Integrity & attestation (roughly: 'trusted computing' on your phone, which signs messages as 'from an unmodified certified device' in a way that the server can verify, so that only connections from known-good devices are allowed) is a much larger problem. The best hope here is that a) it creates too much work for most apps to bother with, and b) it eventually gets restricted as anti-competitive. It's literally Google charging & setting rules for its competitors on how they get a certificate that lets the phones they make work with all the Android apps on the market, and pushing app makers to restrict their apps so they don't work on phones from competitors who don't play ball, so I don't think anti-competition pushback here is that implausible medium term.

    • crowfunder 2 hours ago

      As far as I'm aware it is possible to use Frida without rooting, by using Objection https://github.com/sensepost/objection

      • bri3d an hour ago

        > Patch iOS and Android applications, embedding a Frida gadget that can be used with objection or just Frida itself.

        This is the key thing, and the part that will change next year: previously, you could unpack, patch, and repack an APK with the Frida gadget and install it onto an Android device in Developer mode, while the device remained in a "Production" state (with only Developer mode enabled, and no root). Now, the device would either need to be removed from the Android Certified state (unlocked/rooted) or you would need to sign the application with your own Developer Console account and install it on your own device, like the way iOS has worked for years.

        • crowfunder an hour ago

          Wow, that's horrifying. I guess the APK modding era is over for most users.

          • sureglymop 16 minutes ago

            Not yet. If I recall correctly, only a few countries are affected in the beginning.

pooloo 4 hours ago

Unrelated, but I wonder if the OP's dog moves from the bed to the floor because the radiator turns on? might need more sensor data :D

  • jama211 3 hours ago

    Or just because she noticed she was cold

micah94 5 hours ago

So we're at the point that finding hardcoded admin passwords is no big deal.

  • mtlynch 4 hours ago

    It's a hardcoded default password, not a permanent backdoor. If I'm understanding the post correctly, the user changes it as part of the onboarding flow.

    This is the way most apps work if they have a default password the user is supposed to change.

    • bri3d 4 hours ago

      The device should ideally have some kind of secret material derived per device, like a passphrase generated from an MCU serial number or provisioned into EEPROM and printed on a label on the device.

      Some form of "enter the code on the device" or "scan the QR code on the device" could then mutually authenticate the app using proof-of-presence rather than hardcoded passwords. This can still be done completely offline with no "cloud" or other access, or "lock in"; the app just uses the device secret to authenticate with the device locally. Then the user can set a raw RTSP password if desired.
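
      As a back-of-the-napkin sketch of the factory side (the key, alphabet, and output length here are made up, just to illustrate deriving a printable per-device secret from a serial number):

          import hashlib
          import hmac

          # Hypothetical per-product-line key, kept only in the provisioning system.
          FACTORY_KEY = b"example-provisioning-key"

          def device_passphrase(serial: str) -> str:
              """Derive a printable 10-character secret from the MCU serial number."""
              digest = hmac.new(FACTORY_KEY, serial.encode(), hashlib.sha256).digest()
              # Unambiguous alphabet (no 0/O or 1/I) for the label / QR code.
              alphabet = "23456789ABCDEFGHJKLMNPQRSTUVWXYZ"
              return "".join(alphabet[b % len(alphabet)] for b in digest[:10])

          # Printed on the label and provisioned into EEPROM at the factory.
          print(device_passphrase("SN-00012345"))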

      This way unprovisioned devices are not nearly as vulnerable to network-level attacks. I agree that this is Not Awful but it's also Not Good. Right now, if you buy this camera and plug it into a network and _forget_ to set it up, it's a sitting duck for the time window between network connection and setup.

      • mtlynch 3 hours ago

        I agree that would be nice, but it also doesn't sound all that practical for a small vendor.

        I used to sell a home networking device,[0] and I wouldn't do what you're describing. If there were an issue where the labels had the wrong password or the manufacturer screwed up which device got which label, you wouldn't find out until months later, when the devices are in customers' hands and they start complaining, and then you'd have to unwind your manufacturing and fulfillment pipeline to get back all the devices you've shipped.

        All that to protect against what attack? One where there's malicious software on the user's network that changes the device password before the user can? In that case, the user would just not use the camera because they can't access the feed.

        [0] https://mtlynch.io/i-sold-tinypilot/

        • bri3d 3 hours ago

          Ha! I actually use TinyPilot all the time, nice!

          > I agree that would be nice, but it also doesn't sound all that practical for a small vendor.

          Personalizing / customizing per device always introduces a huge amount of complexity (and thus cost). However, this is TP-Link we're talking about, who definitely have the ability to personalize credentials at scale on other product lines.

          And again, to be clear, I'm not trying to argue that the current way is some horrible disaster from TP-Link, just advocating for a better solution where possible. I think the current system reads as fine, honestly; it sounds like typical cobbled-together hardware vendor junk that probably has some huge amount of "real" vulnerability in it too, but this particular bit of the architecture doesn't offend me badly.

          > now you have to unwind your manufacturing and fulfillment pipeline to get back all the devices you've shipped.

          This can be avoided with some other type of proof-of-presence side channel which doesn't rely on manufacturing personalization - for example, a physical side-channel like "hold button to enable some PKI-based backup pairing or firmware update mode." For a camera, there should probably be an option to make this go away once provisioning is successful, since you don't want an attacker performing an evil maid attack on the device, but for pre-provisioning, it's a good option.

        • chrisweekly 2 hours ago

          Slight tangent: I just read your Tiny Pilot blog post, which was interesting and worthwhile. Thanks for sharing that!

        • kelnos 3 hours ago

          TP-Link is far from being a small vendor, though.

          • mtlynch 3 hours ago

            Ah, I see. I thought OP used TP-Link for their router. I missed that Tapo (the camera manufacturer) is a subsidiary of TP-Link.

          • creeble 3 hours ago

            I think he has it backwards: Easy for a small vendor, very hard for a large one.

      • crowfunder an hour ago

        > The device should ideally have some kind of secret material derived per device, like a passphrase generated from an MCU serial number or provisioned into EEPROM and printed on a label on the device.

        It is better than a simple secret like 12345678, but it can go wrong too, as in the case of UPC UBEE routers, where the list of potential passwords can be narrowed down to ~60 possibilities using a googled generator [1] while knowing only the SSID.

        It did require firmware reverse engineering to figure out [2][3], but the approach applies to most devices I've encountered. Users should ideally always change the default password regardless.

        [1] https://upcwifikeys.com/UPC1236567

        [2] https://deadcode.me/blog/2016/07/01/UPC-UBEE-EVW3226-WPA2-Re...

        [3] https://web.archive.org/web/20161127232750/http://haxx.in/up...

      • miki123211 an hour ago

        These may be illegal in some jurisdictions due to accessibility laws, and are a bad idea in general, both for those reasons and for unattended configuration scenarios.

      • yannyu 4 hours ago

        AT&T routers, for example, ship like this. There's a wifi network and a wifi password printed onto the device.

        But that also means that anyone with physical access can often easily get into the device. The complicated password provides an additional layer of illusory security, because people figure "it's not a default admin password, so it should be good". The fundamental problem seems to be "many people are bad at passwords and onboarding flows", and so variations on shipped passwords seem to result in mostly the same problems.

        • some_random 4 hours ago

          If you have physical access you can just factory reset the device and onboard it with the normal flow though

          • yannyu 3 hours ago

            That's fair, though at least resetting would indicate that an attack happened. Default passwords and printed passwords can result in undetected attacks, which are arguably worse.

            • some_random an hour ago

              It doesn't change anything in this case though, you can't use the default password against a tp-link device after it's been onboarded.

        • recursive 2 hours ago

          I feel seen. Why is the security illusory? I still don't understand the problem with this. Is the concern that someone will break into my house to covertly get access to my wifi password?

        • mystifyingpoi 4 hours ago

          Same with Orange branded ones. There is even a QR code that you can scan on your phone - no more typing 16-24 hex characters.

          It's hard to decide whether it's good or bad. It is definitely easier. Which I guess matters most in consumer grade routers.

      • some_random 4 hours ago

        If you buy the camera, plug it in, and forget to set it up, you just flat out can't use it right? I agree that proof of presence is way better but how many people are seriously going to be affected?

        • bri3d 4 hours ago

          No, if you buy the camera, plug it in, and forget to set it up, then someone can use the default password and key material stored in the app to pretend to be the app and provision it on your behalf.

          That's the only real vulnerability here, and it's no big deal, but it is A Thing and there is definitely a better way to do this that doesn't lose the freedom of full-offline.

          • some_random 4 hours ago

            Ok yeah I think we're in agreement then.

    • m463 3 hours ago

      on the other hand "onboarding" seems to be a less offensive normalizing word which really means "ask permission to use device"...

  • xp84 5 hours ago

    I mean, given that it's updated after setup with the normal flow, I'm okay with it.

    The thing I've most been convinced of in the past 5 years of building as much 'iot/smart home' stuff out as possible in my house is that nearly every vendor is selling crap that has marginal usefulness outside of a 'party trick' in isolation. Building out a whole smart home setup is frustrating unless it's all from one vendor, but there isn't one vendor which does all of it well for every need.

    On my phone I have apps for: Ecobee, Lutron, Hue, 4 separate camera vendors[1], Meross, and Smart Life. Probably a couple more that I'm forgetting.

    Only Lutron and Hue are reasonable in that they allow pretty comprehensive control to be done by a hub or HomeKit so I never have to use those apps.

    It's been years since Matter and Thread were supposedly settled on as the new standards for control and networking, but instead of being full of compatible devices, the market is absolutely packed with cheap wi-fi devices, each of which is cloud-dependent and demands to be administered (and even used day-to-day) only through a pile-of-garbage mobile app whose main purpose is to upsell you on some cloud services.

    [1] I admit the fact I have 4 is my fault for opportunistically buying cameras that were cheap rather than at least sticking with one vendor. But many people have a good excuse, perhaps one vendor makes the best doorbell camera, while another might make a better PTZ indoor camera.

    • hleszek 5 hours ago

      Home Assistant makes more and more sense as a way to build your own fully local and private home automation system.

      • johnmaguire 5 hours ago

        Absolutely. I've been using Home Assistant for around 6 years now and it's amazing for tying hardware from varying ecosystems together.

        Even if your hardware doesn't support local APIs, there's a good chance someone has made an HA integration to talk to their cloud API.

        • borski 5 hours ago

          > Even if your hardware doesn't support local APIs, there's a good chance someone has made an HA integration to talk to their cloud API.

          And if they haven’t, you can pretty trivially write your own and distribute it through HACS (I’ve got three integrations in HACS and one in mainline now)
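
          (For a sense of what "trivially" means here: the bare minimum custom integration is a folder under custom_components containing a manifest.json and an __init__.py roughly like the sketch below - the domain and entity names are made up.)

              # custom_components/my_camera/__init__.py - minimal integration sketch
              DOMAIN = "my_camera"  # hypothetical; must match the "domain" in manifest.json

              async def async_setup(hass, config):
                  """Called by Home Assistant when the integration is loaded."""
                  # A real integration would talk to the vendor's local or cloud API here
                  # and register proper entities; this just exposes a placeholder state.
                  hass.states.async_set(f"{DOMAIN}.status", "connected")
                  return True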

          • xp84 3 hours ago

            Thank you for your contributions btw! There is so much amazing work that's gone into HA and I appreciate it every day.

            • borski 2 hours ago

              Thanks, but it really is a community effort. Even the one I wrote the most of was still me and another guy (the Lucid Motors integration).

      • xp84 4 hours ago

        I love it! But my setup has a lot of sharp edges. It's a combination of things: the "standards compatible" way to connect things to HA lacks features like camera control, dastardly vendors like Chamberlain have basically killed HA support out of spite, and finally I still have to use Google or Amazon for voice assistants.

        My #1 wish would be for someone to build a HA-native voice assistant speaker. I'd pay $100 each for a smart speaker of the physical quality of the $30 Google Home Mini but which integrated directly with HA and used a modern LLM to decide what the user's intent was, instead of the Google Assistant or Siri nonsense which is like playing a text adventure whose preferred syntax changes hourly. I'd pay that plus a monthly fee to have that exist and just work.

        • projektfu 3 hours ago

          Chamberlain can't change MyQ to get around the fact that HA can operate the switch in your garage with a simple controller attached to it. It is very annoying that they are anti-hacker though.

        • paddleon 4 hours ago

          or roll your own.

          This M5 Stack ASR unit costs $7.50 and has a vocabulary of about 40-70 words. That's enough to turn lights and timers on/off. You might need to come up with your own command language, but all of the ASR runs entirely locally.

          https://shop.m5stack.com/products/asr-unit-with-offline-voic...

          • xp84 3 hours ago

            That is probably a great and fun way to solve the problem for those with even a little free time.

            Sadly, for family reasons I can't take on projects that require more than a few minutes, so I'm holding out hope for someone to bridge the gap between the "project boards that require writing a bunch of code to interface with Home Assistant and define all of its possible abilities and commands" and the "dumb-as-a-post Google thing that you just plug in": a hardware device that is easy to connect to HA and starts out doing what the Google thing can do, but smart instead of stupid like the legacy voice assistants.

  • some_random 4 hours ago

    Hard coded admin passwords that you have to change in order to start using the device aren't really an issue.

  • jama211 3 hours ago

    Well, they aren’t here though... I feel like you just wanted to be annoyed at this tech.

  • j45 5 hours ago

    Smartphones can be seen by some as the initial hostile devices.

    Network devices can at least be monitored and discovered like this.

rcarmo 43 minutes ago

Every single post on this site is worth reading. Loads of fun with hacking electronics. :)

johng 37 minutes ago

Interesting... from the terminal I see he named his laptop the same thing I've named one of my cheaper laptops too... craptop. LOL.

Gualdrapo 4 hours ago

Got one for my house but what really annoyed me was that I wasn't able to set a fixed IP for it

  • hank808 4 hours ago

    On your DHCP server (probably your router/gateway), statically assign (reserve) the IP you want the camera to have to its MAC address. This is sometimes called MAC binding.
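
    For example, on a router that runs dnsmasq (OpenWrt and many consumer gateways do), the reservation is a one-liner - the MAC and IP below are placeholders:

        # /etc/dnsmasq.conf - always hand this MAC the same address
        dhcp-host=aa:bb:cc:dd:ee:ff,192.168.1.23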

  • crazysim 3 hours ago

    Some of them support it but not all.

  • xrd 2 hours ago

    Really? Mine has a switch for static. You aren't seeing that in the app? Configuration -> Advanced settings -> Network

GuinansEyebrows 5 hours ago

Thank you for including the final part about what your dog has been up to :)

  • ssgodderidge 4 hours ago

    > "She sleeps"

    The fact that OP did all this work to find out the dog sleeps is pure hacker culture. Love to see it :)

serf 4 hours ago

go2rtc is great. the compatibility range it offers is just huge and gets rid of 90% of the difficulty in making a decent NVR app.
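
for a sense of how little config it needs: a go2rtc.yaml roughly like this (camera address and credentials are placeholders) picks up an RTSP source like this camera and re-serves it to other consumers through go2rtc's own API/web UI:

     # go2rtc.yaml - one named stream per source (address/credentials are placeholders)
     streams:
       tapo_bedroom: rtsp://cameraname:camerapass@192.168.1.23:554/stream1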

ComputerGuru 4 hours ago

Anyone have a similar fix for Yi/Kami cameras?

pessimizer 3 hours ago

The cover image is a 2.8M png, if the author is reading. I gave up my github account so cannot comment.

  • kennedn 2 hours ago

    Now a ~300kB webp, thanks.

    • awilson5454 an hour ago

      Oof, I need to go through my personal site and resize some images. I never considered that.

      Also, fantastic write-up

ck2 2 hours ago

tapo annoyingly is also one of the only cameras that doesn't have a still snapshot URL, after all these years and endless requests from many users

someone needs to make replacement firmware

ffmpeg can fake it but takes a few seconds to grab from the video stream and of course you can't run ffmpeg from your browser (or wait, can you now?)

     ffmpeg -rtsp_transport tcp -i "rtsp://cameraname:camerapass@192.168.1.23:554/stream1" -an -y -vframes 1 -f image2 -vcodec mjpeg "snap.jpg"