Tools (like AI) have no obligations to anyone outside their operator. Luckily, most operators prefer not to cause collateral damage unless that's what they intend. Guns fire in the direction they're pointed; anything else would be a defect. Any built-in "safety" functionality that interferes with the desired use of the economic majority of operators paying for the tool will get eliminated as tools evolve to better serve their operators' needs. For example, Gemini won't tell me how to make a bomb, but what if I want one? Then maybe I'll switch to an open-source uncensored model that will. Over time, the Geminis of the world either reduce their safety features or get outcompeted; either way, we end up with tools that have no real obligations toward non-operators.