Computerworld.com [Hacking News]

Making technology work for business

Now, you can create a digital copy of your personality in just two hours

January 9, 2025 - 21:23

Researchers at Google DeepMind and Stanford University have concluded that a two-hour interview is sufficient to create a realistic AI copy with the same personality as the interviewee.

In an experiment, 1,052 people were interviewed using a questionnaire that addressed everything from personal life events to opinions about society. A digital AI copy of each participant was then created, and when a new round of questions was asked, it gave the same answers as its human counterpart in 85% of cases.

According to the researchers, AI copies of real people can be used in a wide range of contexts, but there are also risks with the technology. For example, they can be used for scams.

Category: Hacking & Security

Apple doubles down on privacy after Siri-snooping settlement

January 9, 2025 - 17:51

Apple has vehemently denied that it ever abused recordings of Siri requests by using those records for marketing, ad sales, or any of the other creepy nonsense we’re being forced to tolerate with other connected devices.

The company’s denial follows a recent $95 million settlement concerning a widely reported sequence of events in which it became known that the company had human contractors grading people’s spoken Siri requests. Many of us were extremely shocked at the nature of what was being recorded and shared with those contractors, and to be fair, Apple swiftly took steps to remedy the situation, which it said was necessary to improve Siri’s accuracy.

The plaintiffs claimed that Apple’s systems had been used to trigger ads targeted at them, which Apple denied even as it settled the case. It’s thought the company chose to settle because it wanted to head off further challenges to its commitment to privacy.

An unforced error with big consequences

The company has always denied that it abused the Siri request records in any way and has consistently pointed out that the recordings were not directly connected to any individual user, which is very unlike the experience you get with other connected devices. That denial wasn’t enough in this case.

That’s because devices that lack Apple’s commitment to privacy are the ones responsible for ads you might encounter that spookily reflect private conversations you may have had. Apple says its systems don’t do that. 

Some companies deny they do this, but the fact that others continue to do so leaves most of us deeply uncomfortable and erodes trust.

In a statement following the resolution of the lawsuit, an Apple spokesperson said: “Apple has never used Siri data to build marketing profiles, never made it available for advertising, and never sold it to anyone for any purpose. Privacy is a foundational part of the design process, driven by principles that include data minimization, on-device intelligence, transparency and control, and strong security protections that work together to provide users with incredible experiences and peace of mind.”

Apple’s track record is a good one

Apple has committed vast resources to creating privacy protections across its systems. Everything from Lockdown Mode to tools that prevent aggressive ad targeting and device fingerprinting reflects the breadth of its efforts, work that touches almost every part of the company’s ecosystem.

A looming problem, of course, is that while Apple might be keeping its pro-privacy promise, not every third-party developer is likely to share the same commitment, despite the privacy labeling scheme the company has in place on the App Store.

This might become an even bigger problem as Apple is forced to open up to third-party stores. It seems plausible that some popular apps sold via those stores will choose to gather user data for profit.

With that monster visible on the horizon, Apple has also confirmed that it has teams working to build new technologies that will enhance Siri’s privacy. It also said, “Apple does not retain audio recordings of Siri interactions unless users explicitly opt in to help improve Siri, and even then, the recordings are used solely for that purpose.”

How Apple already protects Siri privacy

Apple pointed to several protections it already has in place for Siri requests:

  • Siri is designed to do as much processing as possible right on a user’s device — though some requests require external help, many, such as search suggestions, do not.
  • Siri searches and requests are not associated with your Apple Account. 
  • Apple does not retain audio recordings of Siri interactions unless users explicitly opt in to help improve Siri.

Apple has another protection it is putting into place: Private Cloud Compute. This will mean that Apple Intelligence requests made through Siri are directed to Apple’s cloud servers, which offer industry-leading security. “When Siri uses Private Cloud Compute, a user’s data is not stored or made accessible to Apple, and Private Cloud Compute only uses their data to fulfil the request,” the company said.

To some degree, the need to make these statements is a problem Apple foolishly created for itself in the way it initially handled Siri request grading. The manner in which that was done tarnished its reputation for privacy, which is unfortunate given the company knows very well that in the current environment digital privacy is something that must be fought for.

There is a silver lining to the clouded sky. That Apple is now making these statements means it can once again raise privacy as a consideration as we move through the next chapters of AI-driven digital transformation.

All the same, raising the conversation does not in any way guarantee that privacy will win the debate, despite how utterly essential it is to business and personal users in this digital, connected era.

You can follow me on social media! You’ll find me on BlueSky, LinkedIn, Mastodon, and MeWe.

Category: Hacking & Security

So you want to manage Apple devices without using MDM? Here’s how.

January 9, 2025 - 12:00

Recently, I was asked a question I haven’t heard in several years: Can you manage Apple devices without using MDM?

The technical answer is yes. You can use configuration profiles and Apple Configurator to do this.

But you really shouldn’t try that approach. With mobile device management (MDM) vendors licensing their software for as little as $1 per device or user per month, MDM should be the go-to option for all but those on the tiniest of shoestring budgets. (There’s also the possibility of using Apple Business Essentials, a stripped down solution from Apple intended for small organizations.) 

MDM and Apple Business Manager (or Apple Business Essentials) allow for zero-touch deployment. IT does not even have to see a device; it can be shipped new in the box to an employee and it will automatically configure and enroll in MDM when querying Apple’s activation servers during startup.

By contrast, managing devices manually can be extremely time consuming because you have to set up each device by hand when installing configuration profiles — and you must touch it every time you need to make changes. Security updates (or any software updates) cannot be forced to install, leaving it up to each user to install them or not. 

When a device is managed via MDM, there’s a constant back and forth communication between the device and your company’s MDM service. This allows a whole host of features, particularly security features such as being able to query the device status, lock/unlock the device, install software updates, and add applications and other content over the air. 

You also gain the ability to securely separate work and personal use of a device and to make use of managed Apple Accounts rather than relying on a user’s personal Apple account. 

Managed Apple Accounts perform the same function as personal Apple IDs, but they’re owned by an organization rather than the end user and they link to an employee’s work-related accounts. They can also be managed in a way that allows users to access Continuity features at work and provides a work-related iCloud account. One big advantage here is that work-related passwords and passkeys can sync across all of a user’s work devices, and they can be automatically removed from a device if a worker leaves the organization.

Another consideration to keep in mind if you’re a small shop looking to save a few dollars is that you might not always be small. You may not think you need the features that come with MDM solutions, but as your company grows, your needs will change — and you’ll likely have to go through the headache of migrating away from manual management anyway.

This is the part where I tell you to turn back from trying to manage Apple devices manually. 

But if you’re truly determined to go it without using MDM or you’re really that cash strapped and you have a small number of employees and devices, here’s what you need to know. (Just don’t say you weren’t warned if you go this route and run into problems or security breaches.)

The basic component for managing devices is the configuration profile; it’s an XML file that specifies the various options you want to set up. These profiles have been around since the iPhone 3G launched in 2008 (two years before MDM even existed). These files also underpin MDM configuration, but you get a much broader selection of configuration options and an easier interface via MDM.
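
To demystify what those XML files actually look like, here is a rough sketch that builds a bare-bones profile with a single Wi-Fi payload using Python’s plistlib. The identifiers and network name are made-up placeholders, and in practice you would let Apple Configurator or your MDM generate (and sign) the real thing, but the structure is the same.

```python
import plistlib
import uuid

# A configuration profile is just a structured XML property list (.mobileconfig).
# This sketch builds one with a single, hypothetical Wi-Fi payload.
wifi_payload = {
    "PayloadType": "com.apple.wifi.managed",   # standard payload type for Wi-Fi settings
    "PayloadVersion": 1,
    "PayloadIdentifier": "com.example.wifi",   # placeholder identifier
    "PayloadUUID": str(uuid.uuid4()).upper(),
    "PayloadDisplayName": "Office Wi-Fi",
    "SSID_STR": "ExampleCorpWiFi",             # placeholder network name
    "AutoJoin": True,
}

profile = {
    "PayloadType": "Configuration",
    "PayloadVersion": 1,
    "PayloadIdentifier": "com.example.profile.wifi",
    "PayloadUUID": str(uuid.uuid4()).upper(),
    "PayloadDisplayName": "Example Wi-Fi Profile",
    "PayloadContent": [wifi_payload],
}

# Write a standard XML plist that Apple Configurator or an MDM could also produce.
with open("example-wifi.mobileconfig", "wb") as f:
    plistlib.dump(profile, f)
```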

Apple Configurator for Mac is a free tool available in the App Store. There is an iPhone version as well that’s used to enroll devices if they’re not eligible for zero-touch deployment — typically, devices bought outside of a business purchase from Apple or an authorized reseller. (The Mac version can also be used for this purpose.)

The latest version of Apple Configurator supports the management of iPhones, iPads and Apple TVs, but — cautionary alert — it does not support managing Macs. (This is another downside to manual device management.)

Apple Configurator allows you to create a blueprint for various device types and to create configuration profiles with a simple-to-use GUI. You can then assign your profiles to blueprints. Configurator also lets you prepare devices to receive configuration profiles; backup and restore devices; determine whether they will work using Apple’s Supervision functions, which provide some additional control over devices; and to install apps. 

Once you’ve set up blueprints and added configuration profiles and apps, you’ll need to connect each device via a USB-to-Lightning cable (for older devices) or with a USB-C cable (for newer devices) and then assign the device to a blueprint. When preparing a device for Apple Configurator, you can choose to remove various steps in Setup Assistant (just as in MDM). You can also set the device name, wallpaper, and home screen layout.

Managing Macs works essentially the same way — by building configuration profiles. But you need to hand install them on each Mac. Depending on the payload of the profile and whether a user has local admin privileges, the Mac user might be able to delete installed configuration profiles. Keep that in mind.

Apple Configurator can also be used to revive or restore the firmware of Apple devices (including Macs).

Apple provides a user guide that offers additional details and a walk-through of tasks in Apple Configurator.

So, as I noted from the very start, you can see that it’s certainly possible to manage Apple devices manually. But hopefully, you can also now see that there are too many advantages to managing devices using MDM (or Apple Business Essentials) to do it the old-school way. 

From better security to a lighter IT workload and an improved user experience, MDM really can streamline everything needed to keep your fleet of Apple devices up and running.

Category: Hacking & Security

How AI will shape work in 2025 — and what companies should do now

January 9, 2025 - 12:00

AlphaSense is a market intelligence platform that uses generative artificial intelligence (genAI) and natural language processing to help organizations find and analyze insights from sources like financial reports, news, earnings calls, and proprietary documents.

The purpose behind the platform is to allow business professionals to access relevant insights and make data-driven decisions.

Sarah Hoffman, director of AI research at AlphaSense, is an IT strategist and futurist. Formerly vice president of AI and Machine Learning Research at Fidelity Investments, Hoffman spoke with Computerworld about how AI will change the future of work and how companies should approach rolling out the fast-moving technology over the next several years.

In particular, she talked about how the arrival of genAI tools in business will allow workers to move away from repetitive jobs and into more creative endeavors — as long as they learn how to use the new tools and even collaborate with them. What will emerge is a “symbiotic” relationship with an increasingly “proactive” technology that will require employees to constantly learn new skills and adapt.

How will AI shape the future of work, in terms of both innovation and new workforce dynamics? “AI can manage repetitive tasks, or even difficult tasks that are specific in nature, while humans can focus on innovative and strategic initiatives that drive revenue growth and improve overall business performance. AI is also much quicker than humans could possibly be, is available 24/7, and can be scaled to handle increasing workloads.

“As AI automates more processes, the role of workers will shift. Jobs focused on repetitive tasks may decline, but new roles will emerge, requiring employees to focus on overseeing AI systems, handling exceptions, and performing creative or strategic functions that AI cannot easily replicate.

“The future workforce will likely collaborate more closely with AI tools. For example, marketers are already using AI to create more personalized content, and coders are leveraging AI-powered code copilots. The workforce will need to adapt to working alongside AI, figuring out how to make the most of human strengths and AI’s capabilities.

“AI can also be a brainstorming partner for professionals, enhancing creativity by generating new ideas and providing insights from vast datasets. Human roles will increasingly focus on strategic thinking, decision-making, and emotional intelligence. AI will act as a tool to enhance human capabilities rather than replace them, leading to a more symbiotic relationship between workers and technology. This transformation will require continuous upskilling and a rethinking of how work is organized and executed.”

Why is Gen Z’s adoption of AI a signal for broader trends in business technology? “Gen Z, having grown up in a highly digital environment, is naturally more comfortable with technologies like AI. Their rapid adoption of AI tools highlights a shift towards technology-first thinking. As this generation excels in the workforce, their familiarity with AI will drive its integration into business processes, pushing companies to adopt and adapt to AI-driven solutions more quickly.

“Gen Z’s use of AI also reflects the broader understanding that AI complements human skills rather than replaces them. As businesses increasingly adopt AI, they will need to recognize the importance of training employees to work alongside AI, ensuring that AI becomes a valuable tool that enhances human creativity and strategic thinking.”

Sarah Hoffman, AlphaSense

What is AI’s role in business teams and how can companies best leverage it to enhance human skills and knowledge? “AI’s role in teams is to act as a tool that enhances human capabilities rather than [as] a complete replacement for human decision-making. Professionals can use AI to streamline routine tasks, such as data analysis and trend identification, which frees up time for more strategic and creative work. Additionally, AI can accelerate learning and innovation by synthesizing complex data, identifying new perspectives, and providing personalized insights.

“To best leverage AI to enhance human skills and knowledge, companies should:

  • Define AI’s role clearly and establish specific tasks for AI, such as data processing or generating insights, and use it as a tool to support human judgment and decision-making.
  • Regularly check AI’s outputs for accuracy and reliability to ensure its recommendations align with human expertise.
  • Train teams effectively with the knowledge of when to trust AI’s recommendations and, importantly, when to rely on their own judgment and expertise.
  • Enable effective collaboration between AI tools and humans. AI should complement human intelligence, helping teams work more efficiently, creatively, and strategically.”

What should companies prioritize to harness AI for long-term success? “Before companies can leverage this powerful technology and the business opportunities that come with it, they must consider the common pitfalls. Companies can build a proprietary system that may be the best fit for their customers or they can leverage third-party partnerships to mitigate the initial cost of building an AI system from the ground up. This is a pivotal decision that impacts future success and longevity. And the answer doesn’t have to be just build or buy; often a hybrid solution can make sense too, depending on the use cases involved.

“Companies should focus on long-term strategy, quality data, clear objectives, and careful integration into existing systems. Start small, scale gradually, and build a dedicated team to implement, manage, and optimize AI solutions. It’s also important to invest in employee training to ensure the workforce is prepared to use AI systems effectively.

“Business leaders also need to understand how their data is organized and scattered across the business. It may take time to reorganize existing data silos and pinpoint the priority datasets. To create or effectively implement well-trained models, businesses need to ensure their data is organized and prioritized correctly.  

“It’s crucial to have alignment across teams to create a successful AI program. This includes developers, data analysts and scientists, AI architects and researchers and other critical roles that decide the overall business goals and objectives. These teams must work together closely to ensure there is consistency across development, product, marketing, etc.  

“Another critical aspect for companies to consider is the end user. For AI to deliver long-term success, businesses must prioritize understanding the needs and expectations of those who will interact with or benefit from the technology. This involves gathering feedback from end-users throughout the development and implementation process to ensure the solutions being built provide real value. 

“By focusing on these priorities, companies can ensure their workforce is prepared and AI programs are highly effective and ethically sound, positioning themselves for long-term success.”

What are some of the biggest advances you see happening with AI this year? “In 2025, generative AI will transition from its experimental phase to mainstream, product-ready applications across industries. Customer service automation, personalized content creation, and knowledge management are expected to lead this evolution.

“As more production-ready solutions are deployed, companies will refine methods to quantify AI’s impact, moving beyond time savings to include metrics like customer satisfaction, revenue growth, enhanced decision-making, and competitive advantage. These advancements will help executives make more informed investment decisions, accelerating generative AI adoption across industries.

“Generative AI systems will also become significantly more proactive, evolving beyond the passive ‘question-and-answer’ model to intelligently anticipate users’ needs. By leveraging a deep understanding of user habits, preferences, and contexts, these systems could predict and provide relevant information, assistance, or actions at the right moment. Acting as intelligent agents, they may even begin autonomously handling simple tasks with minimal input, further enhancing their utility and integration into everyday workflows.”

For what purposes do you see generative AI moving from pilot to production next year? “The leap from pilot projects to full-scale deployment is the next critical step for generative AI in 2025. While 2024 saw companies experiment with AI for efficiency — such as automating customer service queries or creating personalized content — these applications are expected to mature and deliver measurable business outcomes. As companies refine their data pipelines and AI infrastructure, these tools will likely become integral to daily operations rather than isolated experiments. 

“Beyond efficiency, there’s a growing interest in leveraging AI for strategic innovation. For example, businesses may use generative AI to prototype new products, model market scenarios, or enhance customer experiences. These strategic applications could reshape industries by fostering innovation, increasing competitive advantage, and driving revenue growth.”

This past year, many organizations seemed to struggle with cleaning their data in order to prepare it for use by AI. Why do you believe that’s still necessary?  “Data cleaning remains essential for ensuring AI reliability, even as models become more advanced. Generative AI systems depend on high-quality, consistent data to produce accurate results. Poorly prepared data can lead to biased outputs, reduced performance, and even legal risks in sensitive applications. By standardizing, de-duplicating, and enriching datasets, organizations ensure their AI systems are well-equipped to handle real-world complexity.”

How should companies go about ensuring the responses they get from genAI are accurate? “To ensure the accuracy of generative AI, businesses must employ rigorous testing and validation methods. Models should be evaluated against real-world datasets and specific benchmarks to confirm their reliability.

“Many companies are turning to retrieval-augmented generation (RAG), using domain-specific trusted and citable data to mitigate the risk of misinformation. This approach is particularly critical for applications like healthcare or financial decision-making where errors can have serious consequences. Similarly, in such high stakes functions, human oversight is essential.”
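
For readers unfamiliar with the pattern Hoffman mentions, here is a toy sketch of retrieval-augmented generation. It is not AlphaSense’s implementation; the two-document corpus, the keyword scoring, and the stubbed-out model call are all illustrative stand-ins, but it shows the basic idea of grounding an answer in trusted, citable sources.

```python
# Toy RAG sketch: retrieve trusted documents first, then ask the model to answer
# using (and citing) only those sources. Corpus, scoring, and generate() are stand-ins.

CORPUS = {
    "earnings-q3.txt": "Revenue grew 12% year over year, driven by subscription renewals.",
    "risk-report.txt": "Key risks include currency exposure and supplier concentration.",
}

def retrieve(question, k=1):
    """Rank documents by naive keyword overlap with the question."""
    q_words = set(question.lower().split())
    scored = sorted(
        CORPUS.items(),
        key=lambda item: len(q_words & set(item[1].lower().split())),
        reverse=True,
    )
    return scored[:k]

def generate(prompt):
    """Stub for the LLM call; a real system would send the prompt to a model here."""
    return "[model answer grounded in the prompt below]\n" + prompt

question = "How did revenue change last quarter?"
context = "\n".join(f"[{name}] {text}" for name, text in retrieve(question))
print(generate(f"Answer using only these sources, and cite them:\n{context}\n\nQ: {question}"))
```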

Companies that have deployed AI have used multiple models, but how do you create pipelines between those models and businesses for strategic purposes? “Rather than relying on a single provider, companies are adopting a multi-model approach, often deploying three or more AI models, routing to different models based on the use case. Continuous monitoring is necessary to ensure the models perform optimally, maintain accuracy, and adapt to changing business needs.”

Do you see smaller language models or the more typical large language models dominating in 2025 and why? “In 2025, the choice between smaller language models (SLMs) and large language models (LLMs) will ultimately depend on specific use cases. SLMs are invaluable for specific, narrow tasks that have use-case-specific constraints around security, cost, and latency. SLMs can be faster and cheaper to operate and can be deeply customized for domain workflows. For example, AlphaSense uses SLMs for earnings call summarization. Another advantage of SLMs is that they can be run on-device, which is critical for many mobile applications leveraging sensitive, personal data.

“LLMs, on the other hand, will dominate in general-purpose and complex applications requiring high-level reasoning, adaptability, and creativity. Their expansive knowledge and versatility make them essential for advanced research, multimodal content generation, and other sophisticated use cases. A hybrid approach will likely define the AI landscape in 2025, combining the efficiency of SLMs with the versatility of LLMs, enabling businesses to optimize performance, cost, and scalability.”

Category: Hacking & Security

Google faces privacy lawsuit as judge highlights data governance concerns

January 9, 2025 - 09:50

A federal judge in San Francisco has ruled that a privacy lawsuit against Google, alleging the company improperly collected personal data from mobile devices, can proceed.

Chief Judge Richard Seeborg dismissed Google’s argument that it had sufficiently informed users about its Web & App Activity settings and obtained their consent for tracking, paving the way for a possible trial in August.

The lawsuit accuses Google of intercepting and saving browsing histories without user consent, even after tracking settings were disabled.

Judge Seeborg noted that reasonable users could view Google’s data practices as “highly offensive,” given the ambiguity in its disclosures and internal employee concerns about how the settings were communicated.

“Internal Google communications also indicate that Google knew it was being ‘intentionally vague’ about the technical distinction between data collected within a Google account and that which is collected outside of it because the truth ‘could sound alarming to users’,” Seeborg wrote.

The judge noted that Google defended its practices by downplaying internal employee comments cited in the lawsuit, arguing they were focused on identifying technical improvements rather than raising privacy concerns. The company also said that some employees involved in these discussions lacked familiarity with the Web & App Activity (WAA) settings.

“The concerns raised by Google employees are relevant, however, at the very least for tending to show that the WAA disclosures are subject to multiple interpretations,” Seeborg added. “What is more, the remarks and Google’s internal statements reflect a conscious decision to keep the WAA disclosures vague, which could suggest that Google acted in a highly offensive manner, thereby satisfying the intent element of the tort claim.” 

Broader implications of the case

The legal battle against Google could have far-reaching implications for enterprise data governance, particularly in how companies handle user consent and transparency.

The case raises questions about whether current data collection practices align with user expectations and legal requirements, especially in an era where trust in technology firms is under heightened scrutiny.

“Enterprise data policies have typically assumed that vendors are not saving personal information unless there is some sort of opt-in policy,” said Hyoun Park, CEO and chief analyst at Amalgam Insights. “In particular, the argument that data capture ‘doesn’t hurt anyone’ is a red herring compared to the actual requirement for governance.”

However, while Google’s defense focuses on its own practices, the outcome of the case could drive the industry toward better transparency and accountability.

“Google obviously has to defend its actions and perspective, but my hope is that this finding leads to greater transparency,” Park added. “Obviously, one of the challenges of any complex data service, such as Google or Microsoft or Amazon, is the complexity of governance and administration associated with the data environment and the complexity of tracking the data and activity associated with any sort of service.”

The rise of artificial intelligence is adding complexity to data governance and privacy issues.

While the case focuses on the straightforward matter of capturing personal browsing data, the broader challenge lies in managing the tracking and governance of data-related activities, Park added.

Google’s legal woes continue

Google faces mounting legal challenges as scrutiny over its data practices and market dominance intensifies.

In August 2024, a US District Court ruled that Google held a monopoly in online search, finding that the tech giant had used its market dominance to stifle competition.

In September, Ireland’s Data Protection Commission, Google’s lead privacy regulator in the EU, opened an inquiry into the company’s use of personal data.

However, analysts say Google may be able to limit reputational damage and bolster its standing with corporate clients with efforts to enhance privacy measures.

“This case underscores the growing scrutiny of Big Tech’s data practices and the increasing demand for transparency,” said Thomas George, president of Cybermedia Research. “How Google and other major tech companies respond to these expectations remains to be seen, as they strive to balance competitiveness with maintaining the trust of users and partners.”

Google has not responded to requests for comment.

Category: Hacking & Security

Eset: upgrade from Windows 10 to 11 to avoid ‘security fiasco’

January 8, 2025 - 19:57

Cybersecurity company Eset is now urging Windows 10 users to upgrade to Windows 11 or another operating system well in advance of Oct. 14, 2025, when support for Windows 10 ends.

“It’s five minutes to twelve to avoid a security fiasco in 2025,” Eset security expert Thorsten Urbanski said, according to Bleeping Computer.

Eset estimates there are around 32 million computers still running Windows 10 in Germany alone, roughly 65% of all devices in the country. Windows 11 runs on 16.5 million devices, corresponding to approximately 33%. According to Statcounter, global figures for Windows 10 and 11 use are similar.

Many Windows 10 users have not upgraded because of Windows 11’s higher hardware requirements, which make it inaccessible for older computers.

“The situation is much more dangerous than when support for Windows 7 ended in 2020,” said Urbanski.

Category: Hacking & Security

7+ speedy steps to free up space on your Mac

January 8, 2025 - 19:35

Computers get clogged with digital “stuff” over time, and while we all like to think we’re good at managing all that digital detritus, there’s somehow never quite enough time to clean things up. If you’re new to the Mac, or even if you’ve used an Apple computer for decades, you need to learn these tips to prune the trash. But first, open the Finder menu in the menu bar and choose “Empty Trash.”

You’d be surprised how many Mac users forget to do so regularly.

[Related: 11 tips for speeding up your Mac ]

Check your storage

Your Mac has a really excellent storage management system that is available in System Settings in the General tab. (This is also available via the Apple menu, About this Mac, More info). Open that tab and then select Storage. Your Mac will have a little think and reward you with a nice graphic that shows you what is taking up most space on your machine.

This information is divided across numerous sections:

  • Applications
  • Bin
  • Books
  • Documents
  • iCloud Drive
  • Mail
  • Messages
  • Music
  • Photos
  • Podcasts
  • TV
  • Other Users & Shared
  • macOS
  • System Data

Now that you’ve got a bird’s eye view of your storage, you can begin to get rid of some of the clutter.

Use the Recommendations

Apple has built a system to help you delete some of the most commonly accumulated stuff, which it makes available as Recommendations. These usually appear at the top of the list of stored media, just beneath the graphic in Storage. You won’t see recommendations you have already followed, but they may include:

  • Store in iCloud: This stores all your Desktop and Documents files in the cloud and keeps only recent files locally available on your Mac. The tool will also store messages, attachments, photos, and videos for you. This maximizes available storage space.
  • Optimize Storage: This tool automatically removes movies and TV shows sourced from Apple from your Mac, though you can still download them again.
  • Empty Trash automatically: This tool is recommended because it automatically erases anything that has been in the Trash for more than 30 days.

Open your ‘I’s

Take a look at the list above and you’ll see that each section has a small “I” beside it. Tap this and you’ll get more information to help you manage each of those sections. Tap the “I” icon for Applications, for example, and you’ll see everything you have installed; if you find any software you no longer need, select it in this view and hit Delete to get rid of it, freeing up a little space.

It’s good to take a look inside each category, particularly Messages, where you can delete some of these huge attachments you might not realize you have stored on your device.

What about your Downloads folder?

When did you last take a look inside your Downloads folder? Open it now. (Go>Downloads in the menu bar). Most Mac users find they have lots of items stored there, many of which might still be important. You can free up huge quantities of space on your Mac by going through what you have stored in the folder, filing important items in relevant folders on your Mac, and deleting the rest. Of course, the easiest way to review all those items is to view the files as a list using View>As List. 

Manage all your largest files

Here’s a way to quickly review all the largest files you have stashed on your Mac. Let’s create a Smart Folder to monitor for larger files. 

  • In the Finder Menu choose New Smart Folder.
  • A “New Smart Folder” window appears. You’ll see an option to search “This Mac.” Select that.
  • Look to the left along the row and you’ll see a Save command (which we will use later). You will also see a Plus (+) button. Tap this.
  • A set of choices comes up. The first defaults to Kind. Tap this to access a drop down menu where you should tap “Other.”
  • A long list appears; the one you want is File Size, which you should check.
  • Once you do so, you’ll be able to select it in the drop down list to replace Kind.
  • In the next item on the row, you’ll get to choose a parameter. I suggest you use “is greater than.”
  • Two more choices appear in the row; the first lets you set a number — try 100. The second lets you define a size — try MB.
  • You will immediately see every file on your Mac that is larger than 100MB. You can delete any of these items by control-clicking them and choosing Move to Bin. But be certain not to delete any System files, as doing so may damage your system. In general, it’s a good rule not to delete anything you do not recognize.

Now that you have this bird’s-eye view of the large items on your Mac, you can save the search for future use.

  • Return to the original Row you first looked at, and tap Save.
  • Give the search a name, such as “Large Files.”
  • By default, the search saves in Saved Searches, which is as good a place as any.
  • Also by default, the search can be added to the sidebar — just make sure this option is ticked.

In the future, you’ll find your new “Large Files” search is available to you in Favorites from within the Finder sidebar, making it super easy to swiftly identify any space invaders you still have on your system.
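
If you’re comfortable with Terminal, you can approximate the same “files over 100MB” sweep with a short Python script. This is a rough sketch, not a replacement for the Smart Folder; the starting folder and the size threshold are just example values you can change.

```python
import os

THRESHOLD = 100 * 1024 * 1024        # 100MB, the same cutoff used in the Smart Folder example
START = os.path.expanduser("~")      # scan your home folder; adjust as needed

large_files = []
for root, dirs, files in os.walk(START):
    for name in files:
        path = os.path.join(root, name)
        try:
            size = os.path.getsize(path)
        except OSError:
            continue                 # skip files we can't stat (permissions, broken links)
        if size > THRESHOLD:
            large_files.append((size, path))

# Print the biggest offenders first, in megabytes
for size, path in sorted(large_files, reverse=True)[:50]:
    print(f"{size / (1024 * 1024):8.1f} MB  {path}")
```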

Take a look in Mail

Your email application is full of stuff. All those Mail attachments mount up over the years, and while you need to keep some of them some of the time, you probably don’t need to retain all of them forever. The best practice is to delete attachments in emails you no longer need; you can do this by deleting the message itself or by selecting a message and choosing Remove Attachments from the Message menu.

You can also create a search in Mail that lets you identify emails containing attachments. Try Mailbox>New Smart Mailbox, select “contains attachments” and save. This is a very unsophisticated tool that just makes it easier for you to monitor any emails you might have received that contain attachments, though it still makes for a very manual process. This is actually the problem with Mail: it doesn’t let you easily manage emails containing large attachments. It does let you do one more thing, however, which you should do now: Open Mailbox and choose Erase Junk Mail to get rid of all the junk that has accumulated. You should also select Erase Deleted Items.

Run Onyx or CleanMyMac

There are numerous applications that claim to help you free up and better manage space on your Mac. I like the free Onyx application, which has been my go-to troubleshooting solution for years. But many users also like MacPaw’s CleanMyMac application. What these applications do is make it possible to delete data you can’t easily or safely get to on your Mac, including unwanted database files, bloated logs, and more. Apple says that macOS will automatically clear such data — including temporary database files, interrupted downloads, staged macOS and app updates, Safari website data, and more — when space is needed on your Mac. But some users might prefer to be proactive.

  • With Onyx, install the software, open Maintenance and select and run the Cleaning options available there.
  • Using CleanMyMac, run the Cleanup routine, which will scan your Mac to present you with a selection of choices of what to clean.

What both applications do is force the Mac to run tasks it should do automatically.

Delete old user profiles

If you are using a shared Mac, it is likely also a managed Mac, in which case the following option might not be available because it is handled on your behalf by IT. The problem this solves is that each user on a Mac gets their own user profile, which contains all the data and documents that relate to that user. That’s fine when everyone is actively using the Mac, but when someone stops using the machine it becomes necessary to delete their profile to free up the space – though they should get the data they need off the Mac before you do.

To delete an unwanted user profile, open System Settings>Users & Groups. If you see the word Admin under your name, you will be able to follow the rest of these steps once you click the lock icon and enter your password. Then choose the user you intend to delete and click the – (minus) button to delete the user.

Three options appear:

  • Save the home folder in a disk image: All the information will be archived so it can be restored later.
  • Don’t change the home folder: Everything is left in place and the user can be restored.
  • Delete the home folder: Everything is deleted.

Before choosing the third option, it’s incredibly important to ensure you have the right to delete the user.

If you have additional suggestions, please let me know.

You can follow me on social media! Join me on BlueSky, LinkedIn, Mastodon, and MeWe.

Category: Hacking & Security

Website certificates that expire every six weeks? What IT should know

January 8, 2025 - 12:00

Industry forces — led by Apple and Google — are pushing for a sharp acceleration of how often website certificates must be updated, but the stated security reason is raising an awful lot of eyebrows.

Website certificates, also known as SSL/TLS certificates, use public-key cryptography to authenticate websites to web browsers. Issued by trusted certification authorities (CAs) that verify the ownership of web addresses, site certificates were originally valid for eight to ten years. That window dropped to five years in 2012 and has gradually stepped down to 398 days today.

The two leading browser makers, among others, have continued to advocate for a much faster update cadence. In 2023, Google called for site certificates that are valid for no more than 90 days, and in late 2024, Apple submitted a proposal to the Certification Authority Browser Forum (CA/Browser Forum) to have certificates expire in 47 days by March 15, 2028. (Different versions of the proposal have referenced 45 days, so it’s often referred to as the 45-day proposal.)

If the CA/Browser Forum adopts Apple’s proposal, IT departments that currently update their company’s site certificates once a year will have to do so approximately every six weeks, an eightfold increase. Even Google’s more modest 90-day proposal would multiply IT’s workload by four. Here’s what companies need to know to prepare.

Why the push for shorter SSL certificate lifespans?

The official reason for speeding up the certificate renewal cycle is to make it far harder for cyberthieves to leverage what are known as orphaned domain names to fuel phishing and other cons to steal data and credentials.

Orphaned domain names come about when an enterprise pays to reserve a variety of domain names and then forgets about them. For example, Nabisco might think up a bunch of names for cereals that it might launch next year — or Pfizer might do the same with various possible drug names — and then eight managerial meetings later, all but two of the names are discarded because those products will not be launching. How often does someone bother to relinquish those no-longer-needed domain names?

Even worse, most domain name registrars have no mechanism to surrender an already-paid-for name. The registrar just tells the company, “Make sure it’s not auto-renewed, and then don’t renew it later.”

When bad guys find those abandoned sites, they can grab them and try to use them for illegal purposes. Therefore, the argument goes, the shorter the timeframe in which those site certificates are valid, the less of a security threat they pose. That is one of those arguments that seems entirely reasonable on a whiteboard, but it doesn’t reflect reality in the field.

Shortening the timeframe might lessen those attacks, but only if the timeframe is so short it denies the attackers sufficient time to do their evil. And, some security specialists argue, 47 days is still plenty of time. Therefore, those attacks are unlikely to be materially reduced.

“I don’t think it is going to solve the problem that they think is going to be solved — or at least that they have advertised it is going to solve,” said Jon Nelson, the principal advisory director for security and privacy at the Info-Tech Research Group. “Forty-seven days is a world of time for me as a bad guy to do whatever I want to do with that compromised certificate.”

Himanshu Anand, a researcher at security vendor c/side, agreed: “If a bad actor manages to get their hands on a script, they can still very likely find a buyer for it on the dark web over a period of 45 days.”

That is why Anand is advocating for even more frequent updates. “In seven days, the amount of coordination required to transfer and establish a worthy man-in-the-middle attack would make it a lot tighter and tougher for bad actors.”

But Nelson questions whether expired domain stealing is even a material concern for enterprises today.

“Of all of the people I talk with, I don’t think I have talked with a single one that has had an incident dealing with a compromised certificate,” Nelson said. “This isn’t one of the top ten problems that needs to be solved.”

That opinion is shared by Alex Lanstein, the CTO of security vendor StrikeReady. “I don’t want to say that this is a solution in search of a problem, but abusing website certs — this is a rare problem,” Lanstein said. “The number of times when an attacker has stolen a cert and used it to impersonate a stolen domain” is small.

Getting a handle on faster site certificate updates

Nevertheless, it seems clear that sharply accelerated certificate expiration dates are coming. And that will place a dramatically larger burden on IT departments and almost certainly force them to adopt automation. Indeed, Nelson argues that it’s mostly an effort for vendors to make money by selling their automation tools.

“It’s a cash grab by those tool makers to force people to buy their technology. [IT departments] can handle their PKI [Public Key Infrastructure] internally, and it’s not an especially heavy lift,” Nelson said.

But it becomes a much bigger burden when it has to be done every few months or weeks. In a nutshell, renewing a certificate manually requires the site owner to acquire the updated certificate data from the certification authority and transmit it to the hosting company, but the exact process varies depending on the CA, the specific level of certificate purchased, the rules of the hosting/cloud environment, the location of the host, and numerous other variables. The number of certificates an enterprise must renew ranges widely depending on the nature of the business and other circumstances.

C/side’s Anand predicted that a 45-day update cycle will prove to be “enough of a pain for IT to move away from legacy — read: manual — methods of handling scripts, which would allow for faster handling in the future.”

Automation can either be handled by third parties such as certificate lifecycle management (CLM) vendors, many of which are also CAs and members of the CA/Browser Forum, or it can be created in-house. The third-party approach can be configured numerous ways, but many involve granting that vendor some level of privileged access to enterprise systems — which is something that can be unnerving following the summer 2024 CrowdStrike situation, when a software update by the vendor brought down 8.5 million Windows PCs around the world. Still, that was an extreme example, given that CrowdStrike had access to the most sensitive area of any system: the kernel.

The $12 billion publisher Hearst is likely going to deal with the certificate change by allowing some external automation, but the company will build virtual fences around the automation software to maintain strict control, said Hearst CIO Atti Riazi.

“Larger, more mature organizations have the luxury of resources to place controls around these external entities. And so there can be a more sensible approach to the issue of how much unchecked automation is to exist, along with how much access the third parties are given,” Riazi said. “There will most likely be a proxy model that can be built where a middle ground is accessed from the outside, but the true endpoints are untouched by third parties.”

The certificate problem is not all that different from other technology challenges, she added. 

“The issue exemplifies the reality of dealing with risk versus benefit. Organizational maturity, size, and security posture will play great roles in this issue. But the reality of certificates is not going away anytime soon,” Riazi said. “That is similar to saying we should all be at a passwordless stage by this point, but how many entities are truly passwordless yet?”

What happens when a website certificate expires?

There is a partially misleading term often used when discussing certificate expiration. When a site certificate expires, the public-facing part of the site doesn’t literally crash. To the site owner, it can feel like a crash, but it isn’t.

What happens is that there is an immediate plunge in traffic. Some visitors — depending on the security settings of their employer — may be fully blocked from visiting a site that has an expired certificate. For most visitors, though, their browser will simply flag that the certificate has expired and warn them that it’s dangerous to proceed without actually blocking them.

But Tim Callan, chief compliance officer at CLM vendor Sectigo and vice chair elect of the CA/Browser Forum, argues that site visitors “almost never navigate past the roadblock. It’s very foreboding.”

That said, an expired certificate can sometimes deliver true outages, because the certificate is also powering internal server-to-server interactions. 

“The majority of certs are not powering human-facing websites; they are indeed powering those server-to-server interactions,” Callan said. “Most of the time, that is what the outage really is: systems stop.” In the worst scenarios, “server A stops talking to server B and you have a cascading failure.”

Either way, an expired certificate means that most site visitors won’t get to the site, so keeping certificates up to date is crucial. With a faster update cadence on the horizon, the time to make new plans for maintaining certificates is now.
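
Whatever cadence the CA/Browser Forum eventually settles on, it helps to know exactly when your certificates expire. Here is a minimal monitoring sketch using only Python’s standard library; the hostname is a placeholder, and a real setup would loop over your full certificate inventory and alert well before the expiry date.

```python
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(hostname, port=443):
    """Connect over TLS and return the number of days until the certificate expires."""
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()  # validated certificate, returned as a dict
    expires = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc
    )
    return (expires - datetime.now(timezone.utc)).total_seconds() / 86400

if __name__ == "__main__":
    for host in ["example.com"]:      # placeholder; substitute your own domains
        print(f"{host}: certificate expires in {days_until_expiry(host):.0f} days")
```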

All that said, IT departments may have some breathing room. StrikeReady’s Lanstein thinks the certification changes may not come as quickly or be as extreme as those outlined in Apple’s recent proposal.

“There is zero chance the 45 days will happen” by 2028, he said. “Google has been threatening to do the six-month thing for like five years. They will preannounce that they’re going to do something, and then in 2026, I guarantee that they will delay it. Not indefinitely, though.”

C/side’s Anand also noted that, for many enterprises, the certificate-maintenance process is multiple steps removed.

“Most modern public-facing platforms operate behind proxies such as Cloudflare, Fastly, or Akamai, or use front-end hosting providers like Netlify, Firebase, and Shopify,” Anand said. “Alternatively, many host on cloud platforms like AWS [Amazon Web Services], [Microsoft] Azure, or GCP [Google Cloud Platform], all of which offer automated certificate management. As a result, modern solutions significantly reduce or eliminate the manual effort required by IT teams.”

Category: Hacking & Security

6 fast ways to free up space on your Windows PC

January 8, 2025 - 12:00

Want to free up space on your computer? With the right tools, you can quickly eliminate gigabytes of unnecessary files and get back to work — or whatever else you use your computer for.

I’ve long argued you don’t need a “PC cleaner” app. You mostly just need the tools built right into Windows. They’ll do the job, whether you’re using a workplace PC or a home computer.

But I do have some useful free downloads to recommend that might speed things up.

Get more Windows knowledge with my free Windows Intelligence newsletter. Plus, get free Windows Field Guide downloads as a special welcome bonus!

Windows space-freeing step #1: Use Disk Cleanup

First things first: The classic “Disk Cleanup” tool built into Windows is still the quickest way to free up space. If your PC recently installed a big Windows update, you might be surprised to see that this tool can free up more than 10GB of storage space in just a few clicks.

To launch it, open the Start menu, search for “Disk Cleanup,” and click the “Disk Cleanup” shortcut.

When it opens, select your C: drive, and click “OK.” After it finishes a quick scan, click the “Clean up system files” button and select your C: drive once again. 

You’ll see how much space you can free up here. Look through the list and check whatever you want to remove.

You generally shouldn’t run into any issues with removing most of this stuff. However, watch out for the “Recycle Bin” option — if you check this, Windows will empty your Recycle Bin, and you won’t be able to recover files in it. Also, watch out for the “Previous Windows installations” option. If you see this option here and check it, you’ll free up space — but you won’t be able to “roll back” to the previous Windows update if you ever experience a problem.

When you’re done, click “OK” — and you’ll see a good chunk of space used by temporary files and other clutter freed up for other use instantly.

The Disk Cleanup tool can often free up gigabytes of space in a few clicks.

Chris Hoffman, IDG
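
If you find yourself repeating those clicks on several machines, Disk Cleanup can also be preconfigured and run unattended using its documented /sageset and /sagerun switches. The sketch below simply wraps those two commands in Python; the preset number 1 is arbitrary, and you’d normally schedule the second command with Task Scheduler.

```python
import subprocess

# One-time setup: opens Disk Cleanup's settings UI so you can choose what preset 1 cleans.
# May need to be run from an elevated (administrator) prompt.
subprocess.run(["cleanmgr.exe", "/sageset:1"], check=True)

# Later (or on a schedule): runs preset 1 unattended across drives, no prompts.
subprocess.run(["cleanmgr.exe", "/sagerun:1"], check=True)
```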

Windows space-freeing step #2: Think big

Some programs use quite a bit of space, without any of your own data even playing into the picture. To see just how much each installed application is using, open the Windows Settings app and select Apps > Installed Apps (on Windows 11) or Apps (on Windows 10).

Tell Windows to sort the apps by “Size,” and it will show you the largest applications at the top of the list. You can uninstall applications from here to free up space.

Bear in mind that this isn’t perfect, though. Many applications don’t show storage space here at all — they’ll have a blank entry in the storage column, even though they may be using lots of space. But it’s still a smart place to start getting a general idea of what’s eating up space and where you can turn to free up precious room.

The storage space numbers in the Apps window aren’t always accurate, but they’re a good place to start.

Chris Hoffman, IDG

Windows space-freeing step #3: Lean on WizTree

With temporary files cleaned up and a few big applications uninstalled, the next best way to free up space is to see what’s actually using it. I recommend using a free application called WizTree. It’ll give you an overall “bird’s eye view” of exactly what’s using space on your computer.

Install WizTree, launch it, and click “Scan.” WizTree is very fast at scanning your drive — faster than other tools I’ve used and recommended in the past, including the classic WinDirStat.

Just be ready: WizTree will give you a lot of information. Use the visual view at the bottom of the window, though, and you can mouse over areas and see which folders and files are taking up the most room. Perhaps you’ll find an old backup folder you don’t need, or you could discover some apps are using more space than you expect. 

From there, you can make informed decisions about which files to move off your computer and which programs to uninstall.

A disk space analyzer is a must-have Windows PC utility.

Chris Hoffman, IDG

Windows space-freeing step #4: Turn to the cloud

There’s a good chance you have a lot of files in cloud file storage services like OneDrive, Google Drive, Dropbox, or iCloud Drive. That’s especially true with OneDrive, as OneDrive is built right into Windows, and you get 1TB of cloud storage if you pay for Microsoft 365.

Once you have all that stuff stored safely in such a remote location, there’s a reasonable argument that you no longer need it also taking up space on your own local PC (unless you simply want the redundancy as an extra fail-safe and backup, of course).

You can address this in several ways: First, you can hide certain folders so they don’t sync to your PC. For example, in OneDrive, right-click the OneDrive cloud icon in your system tray, select “Settings,” click the “Account” option, and then click “Choose folders.” You can then uncheck certain folders, and they will never sync to your PC.

OneDrive also downloads files “on demand” as you use them. You can right-click big files in File Explorer and select “Free up space” to save space on your computer. The next time you open that file, OneDrive will automatically redownload it — but it won’t exist in both places and take up room in the meantime. 

Windows space-freeing step #5: Seek out Storage Sense

On Windows 10 and 11, the “Storage Sense” interface is the more modern replacement for the Disk Cleanup tool. It offers a variety of unique features that can help you free up space.

You find it by opening the Settings app from your Start menu, clicking “System,” and then “Storage.”

Once you do, on Windows 11, you can select the “Cleanup recommendations” option under Storage Management to see things Windows recommends you remove. Beware: Windows will recommend you delete the contents of your Downloads folder! Depending on how you use that folder, you might not want to check that box.

To allow the service to automatically free up space in the background, meanwhile, click the “Storage Sense” option (on Windows 11) or “Configure storage sense or run it now” (on Windows 10). Use the options there to configure how you want Storage Sense to work. For example, you could have Storage Sense automatically empty your Recycle Bin and delete old files in your Downloads folder whenever your PC’s available disk space gets low.

The Storage Sense tool is powerful, but be careful if you use it to empty your Recycle Bin or Downloads folder.

Chris Hoffman, IDG

Windows space-freeing step #6: Delete duplicates

If you suspect you have duplicate files just wasting space on your PC, it’s a good idea to try a duplicate file-finding tool. You can pin down whether you have duplicate files — and exactly where they are.

I recommend using the classic (and free) dupeGuru tool to scan for duplicates. However, there are a variety of good duplicate file finders out there. Once you’ve identified the unnecessarily cloned files, you can decide what to do with them.
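
Tools like dupeGuru mostly boil down to the same two-pass idea: group files by size, then confirm duplicates by hashing their contents. Here’s a stripped-down sketch of that approach; the folder path is a placeholder, and it only reports what it finds, leaving the decision about what to delete to you.

```python
import hashlib
import os
from collections import defaultdict

START = os.path.expanduser("~/Documents")   # placeholder folder; point it wherever you like

def sha256_of(path, chunk=1 << 20):
    """Hash a file in 1MB chunks so large files don't blow up memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Pass 1: group by size (cheap); only same-size files can be exact duplicates.
by_size = defaultdict(list)
for root, dirs, files in os.walk(START):
    for name in files:
        path = os.path.join(root, name)
        try:
            by_size[os.path.getsize(path)].append(path)
        except OSError:
            continue

# Pass 2: hash only the candidate groups and report exact-content matches.
for size, paths in by_size.items():
    if len(paths) < 2:
        continue
    by_hash = defaultdict(list)
    for path in paths:
        try:
            by_hash[sha256_of(path)].append(path)
        except OSError:
            continue
    for digest, dupes in by_hash.items():
        if len(dupes) > 1:
            print(f"Duplicate set ({size} bytes):")
            for p in dupes:
                print(f"  {p}")
```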

Other ways to get more space

If you’re still seeking digital breathing room after all of that, don’t give up! A little creative thinking about your specific setup can help you find more ways to save space.

For example, if you have a modern Copilot+ PC and are using the AI-powered Recall feature, you can control how much space Recall uses for snapshots from Recall’s settings.

If you’ve looked through all of your apps for similar space-saving opportunities and are still coming up short, the final answer is easy: You should consider getting more storage space. Life is too short to spend endless time micromanaging exactly what’s on your computer’s local storage!

On a desktop PC, you could just buy an external storage drive and plug it in — or upgrade the internal storage. Even on a laptop, you might be able to insert a microSD card for some extra room for files without adding any bulk.

Want to make the most of your Windows PC? Sign up for my free Windows Intelligence newsletter. You’ll get three new things to try every Friday and free copies of Paul Thurrott’s Windows Field Guides as a special welcome gift.

Kategorie: Hacking & Security

6 swift steps for a faster Android experience

8 Leden, 2025 - 11:45

Well, I’ll be: It’s a new year! Already. Somehow. I think. (For full disclosure, I’m still at least 77% asleep from my traditional end-of-year hibernation/hiatus. Kindly forgive any mid-sentence snores or nonsensical outbursts.)

While most of the world is obsessing over Shiny New Stuff™ for 2025, though — with the avalanche of awkwardly overlapping announcements known as the annual Consumer Electronics Show, along with all the never-ending crowing over almost-functional AI flummery — personally, I like to think of the new year as a perfect opportunity to take stock of stuff you already have, give it all a good old-fashioned tune-up, and get it ready to work even better for you in the months ahead.

Especially with Android devices now being supported with current software for longer than ever, you’ve got every reason to think about your phone(s) and tablet(s) the same way you do your car(s). In both those arenas, a teensy touch of easy occasional maintenance goes an impressively long way in keeping your tech in tip-top shape.

And with Android, you really don’t need much. In fact, so long as your device hasn’t been involved in a metaphorical fender-bender, you don’t even need a mechanic — just about 20 minutes of time and the willingness to get your fingers a little greasy (metaphorically speaking) with some simple cobweb-clearing spruce-ups.

So pop open your hood, pull on the nearest pair of oil-stained coveralls, and get ready to get your hands (metaphorically) dirty: It’s time to step into the garage and get your mobile device back in fighting form for the coming year.

Make your way through the following six steps — and if you’re hungry for even more advanced Android awesomeness after that, check out my free Android Shortcut Supercourse to uncover tons of new time-saving tricks.

Step #1: Uninstall unnecessary apps ⏱ Time required: 3 minutes

This first step may sound silly, but believe you me, it can make a mountain of difference: Whether they’re apps that came preinstalled or programs you downloaded once upon a moon, there’s a decent chance you’ve got at least some unused items lurking in the mustier corners of your favorite Android gizmo. And guess what? Those forgotten icons do more than just collect virtual dust. They actively work against your need for Android-oriented speed.

First, superfluous apps take up space — both in the physical sense of your phone’s internal storage and in the sense of clutter that makes it tougher to find what you actually want at any given moment. Beyond that, abandoned apps often take a toll on a phone’s performance and stamina by needlessly eating up resources. And on top of that, they open the door to some easily avoidable privacy compromises.

You can probably scan through your app drawer pretty quickly and figure out which programs you haven’t opened in the past month or two. If you see something you aren’t using — or something you really don’t need — touch and hold its icon and then look for the “Uninstall” option that appears around or above it. Remember, you can always reinstall it later if the need ever comes up.

And if an app can’t be uninstalled, it probably came preinstalled on your device and is baked into the system courtesy of your phone’s manufacturer and/or carrier. You should still be able to disable it, though: Either long-press it and then touch the “i” icon that appears or find and tap its title within the Apps section of your system settings. Then, look for the “Disable” command, tap it with gusto, and send the thing off to app hell — precisely where it belongs.
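If you’d rather do this cleanup from a computer (handy when you’re tidying several devices), the same uninstall and disable actions are available over adb. The sketch below is an unofficial illustration: it assumes you have Android’s adb tool installed and USB debugging enabled, and the package name in the example is purely hypothetical.

```python
import subprocess


def adb(*args: str) -> str:
    """Run an adb command against the connected device and return its output."""
    result = subprocess.run(["adb", *args], capture_output=True, text=True, check=True)
    return result.stdout


def list_user_apps() -> list[str]:
    """List third-party (user-installed) package names."""
    out = adb("shell", "pm", "list", "packages", "-3")
    return sorted(line.removeprefix("package:").strip() for line in out.splitlines() if line)


def uninstall(package: str) -> None:
    """Remove an app for the current user."""
    adb("shell", "pm", "uninstall", "--user", "0", package)


def disable(package: str) -> None:
    """Disable a preinstalled app that can't be fully uninstalled."""
    adb("shell", "pm", "disable-user", "--user", "0", package)


if __name__ == "__main__":
    for pkg in list_user_apps():
        print(pkg)
    # Example only — substitute a real package name you've decided to remove:
    # uninstall("com.example.unusedapp")
```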

⭐ Bonus tip: Want a helping hand in identifying your unused apps? Grab Google’s standalone Files app (which notably may not be the same as the Files app that came preinstalled on your phone). Open it once, then set yourself a reminder to check back on it in a month. By then, Files will have built up enough data to be able to tell you which apps you aren’t actually ever using — and to give you a super-simple way to get rid of all of ’em with a few quick taps.

The Files app by Google makes it easy to identify and then uninstall apps you aren’t actively using.

JR Raphael, IDG

Just be sure to think through its recommendations carefully before uninstalling anything. Sometimes, the Files app will flag an app as “unused” if you haven’t explicitly opened it in a number of weeks — even if it’s something you actually do rely on as a background utility. But it’s a helpful starting point for this part of the process and a great way to get your tech-tinged tune-up going.

And speaking of stuff that runs in the background…

Step #2: Lock down resource hogs ⏱ Time required: 5 minutes

Maybe there’s an app you genuinely do use but that drags your phone down with over-the-top background activity — in other words, doing stuff you don’t need it to do while you aren’t actively looking at it. Facebook and Instagram are both notorious for this sort of obnoxious behavior, and they’re anything but the only offenders.

Lucky for us, though, even when an app is poorly designed in this way — with abusive background activity and no easy option to stop it — you can still reclaim control. Start by opening up the Battery section of your system settings and finding the app-by-app battery usage breakdown. (On some devices, you may have to tap the three-dot menu icon in the upper-right corner of the Battery settings to uncover that option.) This’ll work best if you do it toward the end of a day, when your phone has plenty of activity to analyze.

Tap any app with high battery usage and then see how much of its activity is happening in the background — while you aren’t actively using it. For any programs with high amounts of background activity, ask yourself: Is this app doing something in the background that actually matters? For instance, do you really need Facebook or any other social media and news tools to be refreshing their feeds while you aren’t looking at ’em? Probably not. But lots of apps like those do that by default and end up draining your device’s battery and monopolizing its horsepower as a result.

For any such items you come across, you’ve got two options: Look in the app’s own settings to see if there’s a way to turn off its background activity — or, provided your phone is running 2017’s Android 8 release or higher (which, by golly, it’d better be!), use Android’s own background restriction option within your phone’s Battery settings or Apps settings to shut it down at the system level.

Disabling an app’s background usage can cut down on unnecessary resource use and make your entire phone feel faster.

JR Raphael, IDG

Let’s check one more place, just to round things out: Head over to the Network & Internet section of your system settings (or the Connections section, if you’re on a Samsung phone) and tap the line labeled “Data usage” — or, if you don’t see that line, tap either “Internet” or “Mobile network” and then tap the gear icon next to your carrier’s name followed by “App data usage.” (On some devices, you might see “App data usage” or possibly “Mobile data usage” right on that initial screen.)

However you get there, you should find a list of how much data different apps have been using as of late. Select any apps with high amounts and see how much of their data transferring is going on in the background. If an app is using a significant amount of background data for no apparent reason, take away its ability to do so by deactivating the “Background data” toggle on that same screen — which will in turn free up precious processing power and battery juice in addition to stopping the needless drain on your mobile data plan.

With both parts of this step, just be sure to use common sense and avoid disabling background permissions for any system-level tools — things like your Phone app or “Android OS” — as well as for any apps that genuinely need such capabilities in order to operate (like a messaging app, for instance, which wouldn’t be able to look for new incoming messages if it didn’t have background data and battery access).
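For the command-line crowd, Android exposes a similar background restriction through its appops interface, which you can drive over adb. Treat the sketch below as an experiment rather than the supported route: the op name and exact behavior vary across Android versions and manufacturer skins, so the Settings path described above remains the safer option, and the package name here is hypothetical.

```python
import subprocess


def adb_shell(*args: str) -> str:
    """Run a shell command on the connected device via adb."""
    result = subprocess.run(["adb", "shell", *args], capture_output=True, text=True, check=True)
    return result.stdout


def restrict_background(package: str) -> None:
    """Deny background activity for one app (Android 8+; assumed op name)."""
    adb_shell("cmd", "appops", "set", package, "RUN_ANY_IN_BACKGROUND", "ignore")


def allow_background(package: str) -> None:
    """Undo the restriction if an app stops syncing or notifying as expected."""
    adb_shell("cmd", "appops", "set", package, "RUN_ANY_IN_BACKGROUND", "allow")


def current_setting(package: str) -> str:
    """Report the app's current background op state."""
    return adb_shell("cmd", "appops", "get", package, "RUN_ANY_IN_BACKGROUND")


if __name__ == "__main__":
    pkg = "com.example.socialapp"  # hypothetical package name
    restrict_background(pkg)
    print(current_setting(pkg))
```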

Step #3: Nuke annoying notifications ⏱ Time required: 4 minutes

This next step is less about system performance and more about your own sanity and ability to get stuff done. Notifications are distractions, after all — and odds are, your phone’s giving you plenty of attention-demanding alerts that are ultimately slowing you down.

Think about all the notifications that show up on your Android device — and then think carefully about how many of ’em provide you with truly pertinent info that warrants the interruption. Do you need to know about every breaking news story the second it happens instead of finding that info when you actively seek it out on your own? What about social media mentions or incoming emails? Only you can decide what makes sense for you, but I’d be willing to wager you have at least a couple (and more likely a couple dozen) types of notifications you’d be better off without.

If you think of any such examples, march into the Notifications section of your system settings and tap “App notifications.” You can then select any app you’ve got and either turn off all of its alerts, at the system level, or disable only certain types of alerts it’s able to generate — for instance, leaving on notifications for direct messages in LinkedIn but shutting off all the other types of interruptions that service loves to send your way.

You can also get even more nuanced and change the way certain types of alerts appear — maybe setting an app’s less important notifications to show up silently so you’ll see ’em eventually but won’t be bothered when they arrive.

⭐ Bonus tip: Provided your phone is running Android 9 or higher, there’s a quick ‘n’ easy way to find your worst notification offenders: Head into that same Notifications section of your system settings, tap “App notifications,” and then look for the “Most recent” bar or the “All” dropdown at the top of the screen.

Tap that, then tap the selector at the top of the screen and change it to “Most frequent” — and then, you’ll see an ordered list of exactly which apps are interrupting you the most. You can tap on any app’s name from that list to jump directly to its notification controls and tell it to kindly hush its virtual beak.

Your Android notification panel doesn’t have to be an inefficient, clutter-filled source of stress.

JR Raphael, IDG

And if you want even more notification-improving intelligence, check out my free Android Notification Power-Pack to explore six next-level notification enhancers that’ll make any Android device instantly more effective.

Step #4: Clear out your storage ⏱ Time required: 3 minutes

Android phones often have limited amounts of local storage — especially when you’re working with an old Android device or one that’s more on the midrange to lower-end side of the spectrum. So the next step in our speed-up will clear out the clutter and free up some of your device’s precious local space. That’ll give you more room for future downloads and app installs, of course, but it can also help your phone run a heck of a lot more smoothly.

The biggest storage-sucking culprit, not surprisingly, tends to be content from your camera — so if you aren’t already using Google Photos’ excellent cloud-syncing capability, head into the app’s settings and set that up now. Then, you can safely erase all the local copies of your photos and videos and still access them as if they were on your device. Plus, you can get to ’em from other phones or computers, too.

The rest is refreshingly easy: Remember that Google Files app we were talking about a minute ago? Open it up and look through the cards on its “Clean” screen. They’ll show you a bunch of smart suggestions for stuff you can clean up and delete, ranging from junk files and duplicates to already backed-up photos, videos, and other files you aren’t using and likely don’t need. You can review all of the suggestions and then click a button to sweep any of the associated files away without ever leaving the app.

You’ll have a smoother and quite possibly speedier Android experience with less stuff clogging up your storage.

JR Raphael, IDG

⭐ Bonus tip: If you’re using a Google-made Pixel phone, look for the “Smart Storage” option within the settings area of the Files app (which you can get to by tapping the three-line menu icon in the app’s upper-left corner and then selecting “Settings”). Activating that will allow your phone to automatically remove already-backed-up photos and videos whenever your storage is running low or after the files have been on your device for a certain amount of time — taking all the heavy lifting out of your hands and making the ongoing maintenance almost entirely effortless.

Step #5: Spruce up your home screen ⏱ Time required: 4 minutes

Almost done! Up next is a simple step that’s all about organization, speed, and the resulting efficiency that’ll bring you — all by getting your home screen tidied up and in optimal working order.

An organized home screen makes it faster and easier to get to the stuff you use the most — and realistically, for most of us, that’s a relatively small number of items. Remember: This isn’t iOS! You don’t have to treat your home screen as a generically cluttered grid of every single thing you’ve got installed.

So look at every item on your home screen and think carefully about how often you use it. If it isn’t something you open at least once daily or close to it, take it off. That way, your home screen will act as an efficient launching pad for your most essential apps, shortcuts, and widgets — the stuff you actually access on a regular basis — and then everything else can be pulled up quickly as needed via your scrolling alphabetical app drawer.

⭐ Bonus tip: If you really want to take your home screen efficiency to the next level, consider exploring some of the many excellent custom Android launchers that are out there and waiting. They replace your standard home screen environment with something much more customizable for your specific work flow and preferred methods of getting stuff accomplished, and they can make any phone feel meaningfully faster, more efficient, and also just more pleasant to use.

Step #6: Flip Android’s secret superspeed switch ⏱ Time required: 1 minute

Last but not least is one of the most powerful speed-boosters for any Android device, no matter how new or high-end it may (or may not) be.

It’s a secret switch that puts your phone into a little-known and deeply buried turbo mode of sorts. It’ll take you virtually no time to find and activate, and trust me: You’ll notice a definite difference the second you do.

I’ve got all the info you need in this quick ‘n’ simple guide.

And with that, my dearest darling, your speed-seeking Android adjustments are complete. Give yourself a hearty pat on the back and grab a well-deserved donut — and get ready for your phone to roll out of the shop and rev its engine all over the world.

Ready for even more advanced Android knowledge? Come check out my free Android Shortcut Supercourse next. You’ll learn tons of time-saving tricks for your freshly optimized phone!

Kategorie: Hacking & Security

OpenAI is losing money on its pricey ChatGPT Pro subscription

7 Leden, 2025 - 18:15

OpenAI CEO Sam Altman, in a post on X, says the AI company is currently losing money on its ChatGPT Pro subscription. “People are using it much more than we expected,” he wrote.

The company introduced its ChatGPT Pro subscription in December. The subscription costs $200 a month and gives users access to an upgraded version of the o1 reasoning model, o1 pro mode, and has no user restrictions for tools such as the video generator Sora.

In another post, Altman wrote that he personally chose the price for ChatGPT Pro in the belief it would bring in money for the company. But the high costs associated with training the large language models (LLMs) that generative AI tools require make profitability difficult for OpenAI to achieve.

Kategorie: Hacking & Security

Why Apple’s AI-driven reality distortion matters

7 Leden, 2025 - 17:35

Apple has been forced to admit what every company involved in artificial intelligence (AI) should also be forced to state — AI makes mistakes, just like people do. 

On the surface, it’s not a terribly big deal:

  • Apple’s AI badly mangled a handful of news headlines.
  • The BBC complained about the mangling.
  • Because it was a story about Apple, everyone discussed it.
  • Apple was eventually forced to answer the criticisms and come up with a plan of action to make things better in the future. 

What that plan means is that the company will update Apple Intelligence “in the coming weeks” to clarify in some way when a notification has been summarized by AI.

The idea behind this is that people reading those headlines will know that there could be a machine-generated error (as opposed to an error by humans) in the news they are perusing. The inference is, of course, that you should question everything you read to protect yourself against machine-generated error or human mistakes. 

Question everything: Human, or AI

The humans who generate news are up in arms, of course. They see the complaint as a cause célèbre from which to make a stand against their own eventual replacement by machines. The UK National Union of Journalists, Reporters Without Borders and the head of Meta’s Oversight Board (if that board still exists by the end of the week) have all pointed to these erroneous headlines to suggest Apple’s AI isn’t yet up to the task. (Though even Apple’s critics point out that part of the problem is that even under human control, public trust in news has already sunk to record lows.)

Those critics also argue that telling users that a news headline has been generated by AI doesn’t go far enough. In their view, it simply means readers must confirm what they read. “It just transfers the responsibility to users, who — in an already confusing information landscape — will be expected to check if information is true or not,” Vincent Berthier, head of RSF’s technology and journalism desk, told the BBC.

But is that really such a bad thing? Shouldn’t readers of human-generated news reports already be checking what they read?

French philosopher Michel Foucault, whose thinking underpins much media literacy theory, would argue that every reader of any news brand should run what they read through an effective framework of critical media analysis. He would urge readers to “criticize the workings of institutions that appear to be both neutral and independent.”

That includes Apple, of course, as well as the BBC — or even me.

Why this and not that?

The idea — and it really isn’t a complicated one — is that you should rarely, if ever, unquestioningly believe what you read, no matter who wrote it, human or machine.

What is written is one thing, why it is written is another. In this case, why has the BBC focused particularly on Apple’s error, rather than exploring the other errors that come with AI?

To some extent the story misses the biggest point: if AI isn’t yet ready to handle a task as relatively trivial as automatic news headline summaries, then this bodes badly for all the other things we’re being told AI should be used for. By inference, it means every AI system, from autonomous vehicles to public transit management or even machine intelligence-supported health services, can make mistakes.

Knowing that machines make errors might help people better prepare to handle those errors as they transpire. As AI becomes more widely deployed, it becomes very important to plan for what to do when things go wrong.

The relatively trivial Apple News headline story’s biggest takeaway is that things will go wrong, so what are we going to do when that happens — particularly when the errors made are more serious than a mangled headline?

Why mistakes happen

One more difference between human and machine is that it is not always possible to identify where AI errors originate. After all, in most cases, human error can be discussed and its reasons for existing understood.

In contrast, machine-driven errors arise from whatever algorithms are used to drive the AI — relationships and decision-making processes that may not be at all transparent. This is the so-called “black box” problem machine intelligence practitioners have been concerned about for decades. At times, it means the logic behind those errors isn’t visible, so the same mistakes can easily recur.

It is not just Apple Intelligence that “hallucinates,” either. All the machines hallucinate, and it’s incredibly important to recognize this before too much discretionary power is given to them. It would also be useful to see major news corporations take a deeper look into the extent to which AI reflects the prejudices of those who own it, rather than trivializing this important matter around discussion of a single brand.

There is a danger, after all, that AI in news becomes a living example of centralized media ownership on steroids, weaving a mirror of the world that reflects a narrowing outlook.

We need tough scrutiny for AI

Given that AI is expected to have a profound impact on culture and society, it seems important to give its implementation serious scrutiny. At the very least, Apple’s proposed solution — to ensure humans can easily identify when AI has been used to decide a news headline — seems a relevant first step towards putting such scrutiny in place.

We should demand the same transparency wherever AI is applied — such as health insurance payment denials, for example. That’s as true for Apple (itself currently planning to extend Apple News into new markets) as it is for anyone else in the business of using AI to get things done.

At the end of the day, the story is not the headline. The story is why the headline was put there in the first place. At Apple. And at the BBC.

You can follow me on social media! Join me on BlueSky, LinkedIn, Mastodon, and MeWe.

Kategorie: Hacking & Security

Google faces new labor board complaint over contractor union bargaining

7 Leden, 2025 - 13:12

The US National Labor Relations Board (NLRB) has filed a fresh complaint against Google, alleging that the company acts as the employer of certain contract workers and must negotiate with their union, Reuters reports.

The Board has said Google is a “joint employer” for roughly 50 San Francisco-based content creators hired through IT contractor Accenture Flex.

These workers, who joined the Alphabet Workers Union in 2023, should be considered under the tech giant’s purview, according to the agency, the report said.

An administrative judge will now hear the complaint, with the decision subject to review by the NLRB’s five-member panel.

If the Board confirms Google’s status as a joint employer for the Accenture Flex contractors, the tech giant would be compelled to engage in collective bargaining and could be held accountable for breaches of federal labor law.

NLRB is also looking into a separate complaint from October, which accuses Google and Accenture Flex of altering working conditions without consulting the union first, according to the report.

This follows the NLRB’s January 2024 ruling requiring Google to negotiate with employees at YouTube Music — an Alphabet subsidiary — hired through a different staffing firm. Google has appealed the decision, and a US federal court is scheduled to review the case later this month.

Google has faced growing labor challenges, marked by worker protests and layoffs. Last year, the company removed a $15-an-hour minimum wage for contractors and implemented changes aimed at sidestepping union negotiations.

Implications for the industry

Google has stated that it does not have sufficient control over contract workers to qualify as their joint employer, according to the report.

The outcome of the case could set a precedent for how contract workers are treated across the tech industry, where companies frequently rely on third-party staffing firms.

“Companies may need to rethink their mix of employment types and how they engage contract and gig workers,” said Sanchit Vir Gogia, chief analyst and CEO at Greyhound Research. “In a worst-case scenario, this work could be moved to locations where such regulations don’t exist. Alternatively, companies might face additional compliance requirements, costs, and audits if the NLRB wins against major corporations.”

Meanwhile, large corporations may need to adopt a more flexible stance on the issue, as the number of contract and gig workers is expected to grow, Gogia added.

A decision against Google could also energize unionization efforts within the tech sector, offering a roadmap for organizing workers in an industry that has traditionally resisted union activity. “The topic is also profoundly interlinked with the country’s political climate,” Gogia said. “If one were to consider the past stand that the Trump administration had on the subject, it is clear that the concept of joint employer may not see the light of day after all.”

Kategorie: Hacking & Security

Trump tariffs could raise laptop, tablet prices by 46%, cut sales by 68%

7 Leden, 2025 - 12:00

A new report from the Consumer Technology Association (CTA) indicates the tariffs President-elect Donald Trump has threatened to impose against foreign shipments of technology into the US could threaten the products consumers rely on.

Trump, who re-takes office on Jan. 20, recently threatened to impose significant tariffs on technology and other imports from Canada, Mexico, and China. On Nov. 25, for example, he unveiled plans to implement a 25% tariff on all goods from Canada and Mexico, using the measure to pressure the two nations to address illegal immigration and drug trafficking.

Additionally, he proposed a 10% tariff on Chinese imports, citing concerns over trade imbalances and unfair practices.

Specifically, levies on technology product and parts imports could reduce US consumer purchasing power by between $90 billion and $143 billion. That, in turn, could force laptop and tablet prices up by as much as 46% — and potentially cut laptop and tablet purchases by 68%, gaming consoles by 58%, and smartphones by 37%, according to the CTA report.

The prospect of new tariffs has raised concerns among economists and trading partners. Maurice Obstfeld, former chief economist at the International Monetary Fund, warned in an interview with MarketWatch that such measures could lead to the formation of hostile trading blocs and a potential global economic downturn.

Consumer Technology Association

In response to the Trump threats, the Canadian government has indicated it would retaliate against any tariffs on Canadian goods; for its part, Mexico has said any such tariffs would not effectively address immigration issues.

“The incoming administration must address how tariffs impact American businesses and consumers,” said CTA Vice President of Trade Ed Brzytwa. “Retaliation from our trading partners raises costs, disrupts supply chains, and hurts the competitiveness of US industries. US trade policy should protect consumers and help American businesses succeed globally.”

Without tariffs in place, the CTA expects robust growth for the US consumer tech industry in 2025, projecting record retail revenue will rise 3.2% compared to 2024 to $537 billion this year. 

Stephen Minton, IDC vice president of data and analytics research, said the impact of tariffs on PC, tablet, and smartphone prices and sales will depend on tariff size, exemptions, timing, and the inclusion of PCs and components.

“So, it’s too early to get specific, but what we do know is that a large share of PCs are currently still manufactured in China — almost 90% of the global market — which makes PCs more exposed to some of the proposed tariffs than most other IT segments,” Minton said.

US vendors like Apple, HP, and Dell still manufacture most of their PCs in China, but some of those companies have begun shifting production to countries like Vietnam and Thailand, Minton noted. (Apple has also made a push to move manufacturing to India.)

Even so, any large new tariffs on imports from China would almost certainly lead to PC price increases, Minton said.

“This could force enterprises to purchase fewer PC upgrades in order to stay within their allocated 2025 budgets,” Minton said. “This would be especially true at the lower end of the market, where there’s very little margin for vendors to absorb the impact of any new tariffs. …It’s likely that any significant new tariffs would be passed on to all customers.”

Canalys

Additionally, in reaction to the possibility of tariffs, tech suppliers could stockpile inventory in early 2025 to avoid future price hikes, according to Greg Davis, an analyst with market research firm Canalys.

Commercial demand for PCs and tablets remained strong in late 2024, with 12% shipment growth in Q3. The Windows 11 refresh is ongoing — especially with the end of support for Windows 10 coming in October — and commercial strength is expected to persist into early 2025, according to Davis. Total PC shipments to the US are expected to rise 6% to just under 70 million units in 2024 followed by modest 2% growth in both 2025 and 2026.

Consumer purchases drove growth earlier this year, but the commercial market now leads US PC sales, according to Davis. Businesses large and small are upgrading to Windows 11 PCs more actively in the second half of the year.

Even so, macroeconomic conditions in the US are not expected to be as stable in the near-term as they have been over the last year or two, Davis said. “With reports of import tariffs seemingly on the horizon, the PC market will likely be impacted in a noticeable way,” he said.

Kategorie: Hacking & Security

Nvidia’s Project DIGITS puts AI supercomputing chips on the desktop

7 Leden, 2025 - 05:30

Nvidia has built its generative AI (genAI) business on delivering massive computing capacity to data centers where it can be used to train and refine large language models (LLMs). 

Now, the company is readying a diminutive desktop device called Project DIGITS, a “personal AI supercomputer” with a lightweight version of the Grace Blackwell platform found in its most powerful servers; it’s aimed at data scientists, researchers, and students who will be able to prototype, tune, and run large genAI models.

Nvidia CEO Jensen Huang unveiled Project DIGITS in a keynote speech on the eve of the CES 2025 electronics show in Las Vegas.

Project DIGITS is similar in size to the Windows 365 Link thin client Microsoft unveiled in November. Microsoft’s Link measures 120mm (4.72 inches) square and is 30mm (1.18 inches) high. 

Nvidia hasn’t given precise dimensions for Project DIGITS, with Allen Bourgoyne, director of product marketing, saying only that the square device will be about as wide as a coffee mug, including the handle, and about half as high. No international standard exists for mugs, but they are typically about 120mm across, including the handle, and around 90mm high, making Project DIGITS as wide as the Link but half as thick again. There the resemblance ends.

The philosophies behind the two devices are quite different: Where the Link pushes almost all the computing capacity to the cloud, Nvidia’s hardware is moving it down to the desktop. 

Microsoft’s Link has 8GB of RAM, no local data storage, and an unspecified Intel processor with no special AI capabilities: If you want to use Windows’ Copilot features they — like everything else — will run in the cloud. Link will sell for around $350 when it goes on sale in April.

One wall outlet, one petaflop

Project DIGITS, on the other hand, will cost upwards of $3,000 when it arrives in May. For that money, buyers will get 4TB of NVMe storage, 128GB of unified Low-Power DDR5X system memory, and a new GB10 Grace Blackwell Superchip; it comes with 20 ARM cores in the Grace CPU and a mix of CUDA cores, RT cores and fifth-generation tensor cores in the Blackwell GPU. 

Together, those cores offer up to 1 petaflop of AI processing capability — enough, said Bourgoyne, to work with a 200-billion-parameter model at “FP4” accuracy locally, with no need for the cloud. By connecting two Project DIGITS devices together via their built-in ConnectX networking chips, it’s possible to work with 400-billion-parameter models, he said.
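That 200-billion-parameter figure roughly checks out on the back of an envelope: at FP4 precision each weight takes half a byte, so the model’s weights alone come to about 100GB, which fits within the 128GB of unified memory with headroom left for activations and other working data. A quick sanity check, assuming weights dominate memory use:

```python
# Back-of-the-envelope memory check for running a model at FP4 precision.
params = 200e9           # 200-billion-parameter model
bytes_per_param = 0.5    # FP4 = 4 bits = half a byte per weight
unified_memory_gb = 128  # Project DIGITS unified system memory

weights_gb = params * bytes_per_param / 1e9
print(f"Weights at FP4: ~{weights_gb:.0f} GB of {unified_memory_gb} GB available")
# Linking two units doubles the budget, hence the quoted 400-billion-parameter ceiling.
```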

The GB10 was co-developed with MediaTek, a company known for its power-efficient mobile chips. Compared to the GB200 processors used in data centers, an NVL72 rack full of which can draw as much as 120kW, Project DIGITS is more power efficient. “So, you can plug it into a standard wall outlet,” Bourgoyne said. “It doesn’t require any additional power than what you have at your desktop.”

Project DIGITS won’t run Windows: Instead, it will run DGX OS, a version of Ubuntu Linux customized with additional drivers and tools for developing and running AI applications. That’s the same software that runs on Nvidia DGX systems in the data center, meaning models built and tested locally on Project DIGITS can be deployed straight to the cloud or data center, the company said. 

Other Nvidia AI tools the device can run include orchestration tools, frameworks, and models on the Nvidia Developer portal and in its NGC catalog. That includes the NeMo framework for fine-tuning models and the RAPIDS libraries for data science.

Nvidia Blueprints and NIM microservices are available under lightweight licenses via its developer program for building agentic AI applications, with an AI Enterprise license needed only when they are moved to production environments, the company said. 

More generative AI on the desktop

Recognizing that you don’t need a GB10 processor to accelerate AI development on the desktop, Nvidia is also introducing a range of NIM microservices and AI Blueprints for building applications on PCs containing its GeForce RTX GPUs — what it calls RTX AI PCs.

Nvidia is introducing a range of AI foundation models — both its own and those of other developers — containerized as NIM microservices that can be downloaded and connected together. Using low-code and no-code tools such as AnythingLLM, ComfyUI, Langflow and LM Studio, developers will be able to build and deploy workflows using these NIM microservices, with its AI Blueprints providing preconfigured workflows for particular tasks such as converting between media formats. One of the newest Blueprints can convert PDFs to podcasts.

Kategorie: Hacking & Security

With o3 having reached AGI, OpenAI turns its sights toward superintelligence

7 Leden, 2025 - 04:52

OpenAI CEO Sam Altman has reinvigorated discussion of artificial general intelligence (AGI), boldly claiming that his company’s newest model has reached that milestone.

In an interview with Bloomberg, he noted that OpenAI’s o3, which was announced in December and is currently being safety tested, has passed the ARC-AGI challenge, the leading benchmark for AGI. Now, Altman said, the company is setting its sights on superintelligence, which is leaps and bounds beyond AGI, just as AGI is to AI. 

According to ARC-AGI, “OpenAI’s new o3 system — trained on the ARC-AGI-1 Public Training set — has scored a breakthrough 75.7% on the Semi-Private Evaluation set at our stated public leaderboard $10k compute limit. A high-compute (172x) o3 configuration scored 87.5%.”

Kategorie: Hacking & Security

Planes, trains and third-party risks — a tale of two IT-related shutdowns

6 Leden, 2025 - 17:56

Christmas Eve and Christmas Day are arguably the most important days of the year for transportation companies. So it was a big deal when an American Airlines system glitch forced the airline to ask the government for a full shutdown on Christmas Eve. And it was an even bigger deal the next day for Bane NOR, which runs the Norwegian rail system and had to shut down all trains in Norway.

Both involved IT issues, and both were mostly — if not entirely — caused by third-party firms. Now, third-party risks are nothing new. But few CIOs truly internalize that one error from a vendor can shut down all enterprise operations. That’s a lot of trust to place in an outside company that typically undergoes only minimal due diligence, assuming it was subjected to any meaningful due diligence at all.

What happened with these Christmas nightmares? Let’s drill into each and note how the two transportation giants differed in their approach.

The more interesting of the two was the Norwegian train shutdown, which lasted 13 hours on Christmas Day, from roughly 8 a.m. until 9 p.m. The problem: trains couldn’t communicate with any traffic control centers, which meant they couldn’t operate safely. The cause: a bad firewall setting.

Let that sink in. Because systems today overwhelmingly run through the internet, firewalls can and will block anything. Until this incident, how many IT managers at Bane NOR realized a firewall setting could shut down every train everywhere?

That was a key reason for the long delay in getting the trains back online. When communications stop, managers think the communications gear is somehow failing.

“It took us a while before we could trace it to a firewall issue. It was not one of the obvious causes to look at,” Strachan Stine Smemo, the Bane external communications manager, said in an email to Computerworld. “It was tricky to find the problem.”

Bane’s team opted against changing any firewall settings and instead — as a temporary measure — switched communications to a different firewall. (They later changed the impacted components, Smemo said.)

Arild Nybrodahl, Bane’s information and communications technology director, said his team detected “system instability” on Christmas Eve, which is when “troubleshooting efforts were initiated.” Things didn’t get bad enough to shut down operations until 8 a.m. the next day, he said.

“The fault affected the railway’s closed mobile network (GSM-R) and other critical communication systems,” Nybrodahl said. “When any emergency calls and other communication between the train and the train conductor do not work, we cannot operate trains. We have located where the error lies in our own nationwide IT infrastructure and we are now working on a solution to correct the error. We have not yet corrected the root cause, but have taken measures so that the part of the network where the error was located is isolated from the rest of the infrastructure.”

Unlike American Airlines, Bane did not identify the relevant third-party and even praised that vendor’s efforts. Bane received “good help from our supplier,” Smemo said. 

American Airlines, however, not only identified the vendor at issue as DXC, but went out of its way to tell reporters that the problems it ran into were that vendor’s fault. This is known as throwing a partner under the bus.

It’s not clear precisely what happened between the two companies, as neither has discussed the particulars. But American made those comments shortly after the one-hour outage ended. That means emotions were at play, and someone at the airline was very unhappy.

(DXC is likely unhappy, too, since its stock price has taken a hit.)

DXC has been a longtime supplier to American — the DXC website says “more than 20 years” — but it’s not precisely clear what role it had in the shutdown. The company has some role in the airline’s flight operations systems and has been working to modernize American’s systems, including moving legacy code to the cloud.

The airline blamed a network hardware issue, without being specific, that forced it to ask the US Federal Aviation Administration for a nationwide ground stop that ended up lasting about an hour.

According to a report on MSN, the incident delayed more than 900 flights, affecting “around 900,000 passengers across 200 US airports, leaving many stranded and sleeping in terminals.”

Given that both of these incidents happened on major holidays, one obvious factor is that the companies had only skeleton crews on duty. Though it’s unlikely that holiday staffing caused either situation, it likely slowed down the responses.

One other wrinkle in the DXC situation: the company on Christmas Eve was already in the middle of an IT leadership change. CIO Kristie Grinnell had given notice about her move to a new job as CIO of TD SYNNEX. That was announced on Dec. 19; two weeks later, DXC announced its new CIO would be Brad Novak.

The problem with throwing a vendor partner under the bus — aside from the fact you haven’t done a full investigation or determined who’s at fault — is that it leaves important questions unanswered. Did this third-party firm have the appropriate skills and personnel to deliver what it was supposed to deliver? If not, then shouldn’t the fault lie with whoever hired that firm?

Let’s say the selection process was appropriate. The question then becomes, “Who was supposed to oversee that vendor?” And was the vendor given everything needed to do the job?

From the perspective of shareholders, the fault is more often going to lie with the people overseeing and bringing in the outside firm. Unless the third-party company ignored instructions or engaged in bad behavior, most mishaps are going to be blamed on the enterprise.

Put bluntly, an enterprise that is quick to blame a contractor is likely trying to change the subject before its own failings are examined.


Kategorie: Hacking & Security

WordPress.org statement threatens possible shutdown for all of 2025

6 Leden, 2025 - 17:45

Editor’s note: On Jan. 3, 2025, WordPress.org staff announced that they were resuming the suspended services.

Automattic CEO Matt Mullenweg on Friday announced a shutdown of almost all services on WordPress.org, the open source project site that’s the home of the software, plugins, and the WordPress community, but was unclear on when the shutdown would end. 

This move sharply increases the uncertainty surrounding WordPress, IDC said.

“My sense is that many enterprise WordPress administrators will think twice about continuing to use the software under these circumstances,” said IDC Research Manager Michele Rosen. “It’s such a shame to watch a leader in the open source community repeatedly sabotage his own project.”

“At this point, I have real concerns about the impact of Matt Mullenweg’s words and actions on the overall image of open source software,” she added. “Even if he feels that WP Engine’s actions are unethical and the court is wrong, his actions are clearly having an impact on the WordPress ecosystem, including his own business. It seems self-destructive.”

To put this move into context, the shutdown only directly impacts WordPress.org, whereas most enterprises using Automattic’s WordPress are leveraging WordPress.com, the commercial hosting site. But given the ripple effects across all of WordPress, it is likely that enterprise users would also be impacted.

“The WordPress CMS is licensed under the GPL, so it is permanently available for free. However, a lot of WP’s value comes from themes and plugins,” Rosen said. “My understanding is that in some cases, the wordpress.org URL is hardcoded into WordPress, which can make it difficult or impossible to update your themes and plugins if they haven’t been added to the directory. It really depends on the particular website’s configuration.”

Hopes to restart ‘sometime in the new year’

The Mullenweg statement started off innocuously enough, saying that the WordPress.org team will take some time off for the holidays at the end of the year. But it turned unsettling when it raised the possibility that they may not reopen at all in 2025.

“In order to give myself and the many tired volunteers around WordPress.org a break for the holidays, we’re going to be pausing a few of the free services currently offered. New account registrations on WordPress.org — clarifying so press doesn’t confuse this: people can still make their own WordPress installs and accounts,” the statement said, adding that service pauses will also include “new plugin directory submissions, new plugin reviews, new theme directory submissions and new photo directory submissions. We’re going to leave things like localization and the forums open because these don’t require much moderation.”

But after mentioning his ongoing legal struggles with WP Engine, Mullenweg said “I hope to find the time, energy, and money to reopen all of this sometime in the new year. Right now, much of the time I would spend making WordPress better is being taken up defending against WP Engine’s legal attacks.”

Shutdown may hurt WordPress

Peter Zeitsev, the founder of Percona, an open source database software vendor, said that if the shutdown continues through all of 2025, “this will stifle the development of WordPress — no new user accounts, no new plugins published, etc. This could also spark the creation of an alternative hub to wordpress.org, one that would be truly operated in the interest of the [open source] community.”

Zeitsev said that he fears that there will be meaningful enterprise impacts if the shutdown continues. “Many WordPress users do not really interact with WordPress.org at all, but some commercial enterprise users can also rely on WordPress.org functionality, and they can be impacted,” he said.

Asked how this move will help WordPress.org, Zeitsev thinks it likely won’t, and that it might end up hurting them. 

“It might be that [Mullenweg] thinks there will be public/community pressure on WP Engine and the court to take his side, but I feel it will be seen as the opposite. Matt has been a wonderful steward of the WordPress community for so long, so governance and ownership of WordPress.org were not thought about,” Zeitsev said.

“Now things have changed, and commercial and community players in the WordPress space will be thinking about how much authority Matt personally has, and whether or not they can trust him to operate the ecosystem they invested so much in, in a way that reflects its interest.”

Kategorie: Hacking & Security

Apple Intelligence: Is AI an opportunity or a curse?

6 Leden, 2025 - 15:39

Does the rise of artificial intelligence represent more of an opportunity for the world — or a curse? For all the clamor about boosted productivity and enhanced human potential, there’s also the rising demand for energy, processing power, and memory, along with ever more bloat on the machine.

Just look at Apple Intelligence, which now demands almost twice as much data storage on your devices as was originally advertised.

I fear that’s the thin end of this cursed wedge. And it’s not as though storage is the only demand AI makes. 

AI is a greedy beast

Apple has been forced to roll out major hardware changes to support Apple Intelligence:

  • Memory: Apple has increased base memory across all of its machines. Macs, iPhone, and iPads all now ship with much more memory than before, boosting manufacturing costs.
  • Processor: Apple has really pushed the boat out on processors in its latest hardware. The company effectively raised everyone up an extra grade during the last 18 months as it primed its ecosystem for Apple Intelligence with new, faster, more energy-efficient processors.
  • Energy efficiency: Not only is Apple Silicon more energy efficient, but the company wants to give its devices more energy capacity. To do so, it is expected to shift to silicon-anode cells over the next 12 months. These hold around 15% more energy, which will be useful for the energy demands of edge AI.
  • Server infrastructure: Reflecting its realization that not every task can be accomplished on edge devices, Apple has now re-entered the server market, introducing its own take on secure server-based cloud computing services, Private Cloud Compute.

Apple isn’t alone in any of this, but its actions highlight the extent of the hundreds of billions being spent on the sector today — costs that extend into essential infrastructure resources such as water, rare materials, and energy supply. All of this costs enterprises money, focus, and time. The rewards? Even OpenAI, arguably the doyen of AI tech, is shedding cash faster than it makes it, even on its priciest $200-per-month ChatGPT Pro plan.

What need does the greed feed?

Right now, all we really seem to be experiencing is more targeted ad placement, email and website summaries, stupid pictures in messages, deep employment insecurity, rising energy costs, and an increasingly homogenized trade in optimized job resumes, press releases, and student exam papers. Oh, and don’t forget the fake video influencers hawking their wares on heavily AI-SEO’d social media.

We’re sold on potential, but may yet wind up with little more than a smarter search engine and a deeply intrusive invasion of privacy. Fantasia or dystopia? Even Elon Musk seems unsure, warning of the perils of AI at one point, only to introduce his own AI model later on. 

The hype is unavoidable at this week’s Consumer Electronics Show (CES), where AI is going to appear in some form across all the exhibit stands. Everyone and anyone who can link their product up to some form of AI service will do so.

As usual, some of the claims will turn out to be vaporware, while other combinations won’t really deliver much tangible benefit. To invent an example, do we really need an AI tool that orders groceries for personalized dinner plans it builds based on what it knows about our week? Or do we just need a recipe book and a takeaway menu?

What about the consequences of this kind of data being weaponized by AI? How is the information that AI gathers stored, who else can access it, and what control over it do we have? Do we really need dodgy surveillance-as-a-service firms to be able to identify information about us that they can then use to send convincingly authentic AI-targeted and developed phishing attacks to gain access to our digital lives? How well thought through are the solutions rapidly appearing on the table, and how much consideration has gone into weighing the potential consequences?

Behind the hype

Am I being unfair? 

I’m certain there are AI proponents who think the potential of what we are investing in far outweighs the risks. But there are always people prepared to make such claims. Right now, for most of us (even with Apple Intelligence), the hype, hoop-la, and costs haven’t yet delivered on the clamor. The rest of us watching tech bros snicker and smile on their shiny AI cavalcade remain to be convinced.

With that in mind, it seems a slow and steady approach to AI deployment could end up being the King’s Gambit in this game. Rather than chasing the evangelists, the industry should focus on putting together solutions that deliver genuine benefit instead of simply looking good in headlines (whoever writes them). We need to see true and tangible improvements to foster trust, and if the people behind them genuinely believe AI will drive future hardware sales, they’ll make sure their AI solutions do just that.

Or fail. 

You can follow me on social media! Join me on BlueSky, LinkedIn, Mastodon, and MeWe.

Kategorie: Hacking & Security

AI revolution drives demand for specialized chips, reshaping global markets

6 Leden, 2025 - 12:00

Artificial Intelligence (AI) has rapidly transformed the chip industry since its mainstream arrival over the past two years, driving demand for specialized processors, accelerating design innovation, and reshaping global supply chains and markets.

The generative AI (genAI) revolution that began with OpenAI’s release of ChatGPT in late 2022 continues to push the limits of AI inference, large language models (LLMs) and semiconductor technologies. In short order, traditional CPUs, insufficient for AI’s parallel processing needs, have given way to specialized chips: GPUs, TPUs, NPUs, and AI accelerators.

That prompted companies such as Nvidia, AMD, and Intel to expand their portfolios to include AI-optimized products, with Nvidia leading in GPUs for AI training and inference. And because AI workloads prioritize throughput, energy efficiency, and scalability, the larger tech industry has seen massive investments in data centers, with AI-focused chips like Nvidia’s H100 and AMD’s MI300 now powering the backbone of AI cloud computing.

At the same time, companies such as Amazon, Microsoft, and Google have developed custom chips (such as AWS Graviton and Google TPU) to reduce dependency on external suppliers and enhance AI performance.

In particular, the AI revolution has propelled growth at Nvidia, making it a dominant force in the data center marketplace. Once focused on producing chips for gaming systems, the company now sees its AI-driven hardware and software business outpace those efforts, which has led to remarkable financial gains. The company’s market capitalization topped $1 trillion in May 2023 — and passed $3.3 trillion in June 2024, making it the world’s most valuable company at that time.

The AI-chip industry, however, is about to change dramatically. Over the past several years, semiconductor developers and manufacturers have focused on supplying the data center needs of hyperscale cloud service providers such as Amazon Web Services, Google Cloud Platform, and Microsoft Azure; organizations have relied heavily on those industry stalwarts for their internal AI development.

There’s now a shift toward smaller AI models that only use internal corporate data, allowing for more secure and customizable genAI applications and AI agents. At the same time, Edge AI is taking hold, because it allows AI processing to happen on devices (including PCs, smartphones, vehicles and IoT devices), reducing reliance on cloud infrastructure and spurring demand for efficient, low-power chips.

“The challenge is if you’re going to bring AI to the masses, you’re going to have to change the way you architect your solution; I think this is where Nvidia will be challenged because you can’t use a big, complex GPU to address endpoints,” said Mario Morales, a group vice president at research firm IDC. “So, there’s going to be an opportunity for new companies to come in — companies like Qualcomm, ST Micro, Renesas, Ambarella and all these companies that have a lot of the technology, but now it’ll be about how to use it.

“This is where the next frontier is for AI – the edge,” Morales said.

Turbulence in the market for some chip makers

Though global semiconductor chip sales declined in 2023 by about 11%, dropping from the previous year’s record of $574.1 billion to around $534 billion, that downturn did not last. Sales are expected to increase by 22% in 2025, according to Morales, driven by AI adoption and a stabilization in PC and smartphone sales.

“If you’re making memory or making an AI accelerator, like Nvidia, Broadcom, AMD or even Marvell now, you’re doing very well,” Morales said. “But if you’re a semiconductor company like an ST Micro, Infineon, Renesas or Texas Instruments, you’ve been hit hard by excess inventory and a macroeconomy that’s been uncertain for industrial and automobile sectors. Those two markets last year outperformed, but this year they were hit very hard.”

Most LLMs used today rely on public data, but more than 80% of the world’s data is held by enterprises that won’t share it with platforms like OpenAI or Anthropic, according to Morales. That trend benefits processor companies, especially Nvidia, Qualcomm, and AMD. Highly specialized System on a Chip (SoC) technology with lower price points and more energy efficiency will begin to dominate the market as organizations bring the tech in-house.

“I think it’s definitely going to change the dynamics in the market,” Morales said. “That’s why you’re seeing a lot of companies aligning themselves to address the edge and end points with their technology. I think that’s the next wave of growth you’re going to see along with the enterprise; the enterprise is adopting their own data center approach.”

Intel will continue to find a safe haven for its processors in PCs, and its decision to outsource manufacturing to TSMC has kept it competitive with rival AMD. But Intel is likely to struggle to keep pace with other chip makers in emerging markets.

“Outside of that, if you look at their data center business, it’s still losing share to AMD and they have no answer for Nvidia,” Morales said.

While Intel’s latest x86 processors and Gaudi AI accelerators are designed to compete with Nvidia’s H100 and Blackwell GPUs, Morales sees them more as a “stopgap” effort, not what the market is seeking.

“I do believe on the client side there’s an opportunity for Intel to take advantage of a replacement cycle with AI working its way into PCs,” he said. “They just received an endorsement from Microsoft for Copilot, so that gives their x86 line an opening; that’s where Intel can continue to fight until they recover from their transformation and all the changes that have happened at the company.”

To stay relevant in modern data centers — where Nvidia’s chips are driving growth — Intel and AMD will need to invest in GPUs, according to Andrew Chang, technology director at S&P Global Ratings.

“While CPUs remain essential, Nvidia dominates the AI chip market, leaving AMD and Intel struggling to compete,” Chang said. “AMD aims for $5 billion in AI chip sales by 2025, while Intel’s AI efforts, centered on its Gaudi platform, are minimal. Both companies will continue investing in GPUs and AI accelerators, showing some incremental revenue growth, but their share of the data center market will likely keep declining.”

Politics, the CHIPS Act and what happens after Jan. 20

Geopolitical and economic factors such as export restrictions, supply chain disruptions, and government policies could also reshape the chip industry. President-elect Donald J. Trump, who takes office Jan. 20, has signaled he plans to impose heavy tariffs on chip imports.

The CHIPS and Science Act is also promising billions of dollars to semiconductor developers and manufacturers who locate operations in the US. Under the Act, $39 billion in funding has been earmarked for several companies, including TSMC, Intel, Samsung and Micron — all of whom have developed plans for, or are already building, new fabrication or research facilities.

But for those tax dollars to be doled out, each company must meet specific milestones; until then, the money remains unspent. While the promise of billions of dollars in incentives is unquestionably helping reshore US chip production, Morales pointed to the CHIPS Act’s 25% investment tax credit as a greater benefit.

“Even a company like Intel…is getting about $50 billion [in tax breaks], which is unheard of. That’s where the winning payouts are,” he said.

Though Trump has signaled that government funding to encourage reshoring is the wrong tactic, industry experts do not believe the CHIPS Act will be drastically cut when he returns to office. “We expect modest revisions to the CHIPS Act, but not something as drastic as cutting funding yet to be disbursed,” Morales said. “The CHIPS Act received bipartisan support, and any attempt to revise it would face pushback from states that stand to benefit, such as Arizona and Ohio.”

Though high-end processors to power energy-sucking cloud data centers have dominated the market to date, energy-efficient AI processors for edge devices will likely continue to gain traction.

“Think about an AI PC this year or a smartphone that incorporates AI as well, or even a wearable device that has a smaller, more well-tuned model that can leverage AI inferencing,” Morales said. “This is where we’re going next, and I think it’s going to be very big over the coming years.

“And, I think AI inferencing, as a percentage of the companies, will be as big as, if not bigger than, what we’ve seen in the data center so far,” he added.

From LLMs to SLMs and edge devices

Enterprises and other organizations are also shifting their focus from single-modality AI models to multimodal AI, or LLMs capable of processing and integrating multiple types of data (“modalities”), such as text, images, audio, video, and sensory input. Drawing on these diverse sources creates a more comprehensive understanding of the data and enhances performance across tasks.
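As a rough illustration of what combining modalities looks like in code, the sketch below uses the open-source Hugging Face transformers library to answer a text question about an image; the model name is a public example checkpoint and the image path is hypothetical:

```python
# Minimal sketch of multimodal AI: pairing an image (pixels) with a
# text question (language) in a single inference call.
# Assumes the open-source "transformers" library; the model name is an
# example public checkpoint and the image path is hypothetical.
from transformers import pipeline

vqa = pipeline(
    "visual-question-answering",
    model="dandelin/vilt-b32-finetuned-vqa",  # example VQA model
)

answers = vqa(
    image="factory_floor.jpg",  # hypothetical local image
    question="How many forklifts are visible?",
    top_k=1,
)

print(answers[0]["answer"], answers[0]["score"])
```

Larger multimodal LLMs fold this kind of cross-modal reasoning into a single model, which is part of what drives the infrastructure upgrades and chip demand described here.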

Over 80% of organizations expect their AI workflows to increase in the next two years, while about two-thirds expect pressure to upgrade IT infrastructure, according to a report by S&P Global.

Sudeep Kesh, chief innovation officer at S&P Global Ratings, noted that AI is evolving towards smaller, task-specific models, but larger, general-purpose models will still be essential. “Both types will coexist, creating opportunities in each space,” he said.

A key challenge will be developing computationally and energy-efficient models, which will influence chip design and implementation. Chip makers will also need to address scalability, interoperability, and system integration — all of which are expected to drive technological advances across industries, improve autonomous systems, and enable future developments like edge AI, Kesh said.

In particular, as companies move away from cloud-based LLMs and embrace smaller language models that can be deployed on edge devices and endpoints, the industry will see increased interest in AI inferencing.

“It’s an environment where it’s feast or famine for the industry,” IDC’s Morales said. “What’s in store for the coming year? I think the growth we’ve seen in the data center has been phenomenal, and it will continue into 2025. What I’m excited about is that enterprises are beginning to prioritize IT spending dollars on AI, and that will break a second wave of demand for processors.”

Kategorie: Hacking & Security