In this dynamic digital economy, the more data-savvy your workforce is, the stronger your business outcomes will be. But, in Australia, data literacy remains the weakest link in the data value chain.
Organisations are generating vast amounts of data but are missing the mark when it comes to equipping employees with the skills they need to harness data-driven insights in their roles.
That’s why data literacy has become a key area of focus for business leaders seeking a competitive advantage in every facet of their organisation — from operations to sales to recruiting and retention.
A recent survey conducted by Forrester Consulting and commissioned by Tableau found that 71 per cent of Australian business decision makers agree that every employee across all departments should have at least basic data skills. Yet only 33 per cent of the workers surveyed said their organisation offered them data training — representing a significant divide in expectation and execution, with Aussie employees emerging as the most dissatisfied when compared to countries like Singapore and Japan.
In Australia, the impact of a competitive job market and talent shortage is widely felt. Recent ABS data revealed there were almost half a million job vacancies in Australia as of May 2022, more than double the number recorded in February 2020.
According to the survey, investment in skills training could play a key role in driving employee retention, with 90 per cent of Australian employees more likely to stay with a company that invests in training.
People are your strongest asset
The most powerful asset in creating business value from data is people. Employees are the ones who need to be able to harness data-driven insights to solve problems, streamline processes, innovate and ultimately make better decisions.
The ‘why’ for investing in data literacy is clear. According to the Forrester survey, 70 per cent of Australian employees are expected to use data heavily in their job by 2025, which has almost doubled since 2018 (38 per cent).
Businesses need to agree on the level of data proficiency required for different job types. For example, a sales representative doesn’t need the same level of knowledge as a data scientist, but both should be able to use data meaningfully in their roles. So training and investing in data literacy doesn’t mean having to train every staff member to become a data scientist — it means empowering them to succeed in their individual roles.
But when it comes to the ‘how’, where do businesses start?
It takes an ecosystem to build a data-literate workforce
Data literacy requires a shift in mindset, and that shift can’t be an afterthought. Leaders have a pivotal role to play in instilling the day-to-day discipline to use data.
While there’s no one-size-fits-all approach, here are a few ways to get started:
- What does training look like? – Clarifying what the business aims to achieve through training will help determine the curriculum required and whether an external partner is needed to support training initiatives. Lean into people analytics solutions to recruit and train more efficiently.
- Spark curiosity – Showcasing the real-life impact of data within the business can help incentivise learning for employees who can see how data can inform decisions and change outcomes.
- Gamify data training – If employee engagement is a barrier to implementing a training program, gamification can play a vital role in creating engaging learning programs that help employees pick up data as a second language.
- Build a safe space for collaboration – Consider creating internal communities to support employee learning. Implementing formal training is a great first step, but moving beyond that is important. Businesses can benefit from creating a space where employees can collaborate, share and learn from each other beyond formal training.
Investing in data skills is worth it
Investing in data skills has certainly been paying off for many organisations. Some of the noticeable benefits have been greater innovation, better customer experiences, smarter decision making, lower costs and higher revenues.
For example, Transurban, the leading Australian-owned toll road operator and developer, needed to create a strong data culture to ensure employees were fully equipped with the right skills and confident about making data-driven decisions. With visual analytics, almost half of all Transurban employees have developed the data skills to gain greater insight into everything from what’s happening on the roads to accident hotspots and customer behaviour.
The company’s data culture has been built through investment in the implementation of regular training sessions, external support from a partner and an internal centre of excellence.
When it comes to investing in data skills, the question is no longer about who’s responsible for training employees — it’s about how we accelerate the path to becoming a data-driven organisation. The last two years have made it crystal clear that data-driven insights are essential for organisations to make fast, informed decisions — transforming insights into action.
Businesses need to remember that the most important investment they make is in people. And ultimately, unlocking the true value of data is a team sport. Having team members that are skilled in data and analytics, especially as the volume of data continues to grow exponentially, is what will position organisations for success in an increasingly digital world.
Amazon won’t have to pay hundreds of millions in back taxes after winning EU case
LONDON (AP) — Amazon won’t have to pay about 250 million euros ($273 million) in back taxes after European Union judges ruled in favor of the U.S. e-commerce giant Thursday, dealing a defeat to the 27-nation bloc in its efforts to tackle corporate tax avoidance.
The ruling by the EU’s top court is final, ending the long-running legal battle over tax arrangements between Amazon and Luxembourg’s government and marking a further setback for a crackdown by antitrust chief Margrethe Vestager.
The Court of Justice backed a 2021 decision by judges in a lower court who sided with Amazon, saying the European Commission, the EU’s executive branch, had not proved its case that Amazon received illegal state support.
“The Court of Justice confirms that the Commission has not established that the tax ruling given to Amazon by Luxembourg was a State aid that was incompatible with the internal market” of the EU, the court said in a press release.
Amazon welcomed the ruling, saying it confirms that the company “followed all applicable laws and received no special treatment.”
“We look forward to continuing to focus on delivering for our customers across Europe,” the company said in a statement.
The commission said it “will carefully study the judgment and assess its implications.”
The case dates back to 2017, when Vestager charged Amazon with unfairly profiting from special low tax conditions since 2003 in tiny Luxembourg, where its European headquarters are based. As a result, almost three-quarters of Amazon’s profits in the EU were not taxed, she said.
The EU has taken aim at deals between individual countries and companies used to lure foreign multinationals in search of a place to establish their EU headquarters. The practice led to EU states competing with each other and multinationals playing them off one another.
Tesla Autopilot recall: 2 million vehicles need to have their defective systems fixed
DETROIT (AP) — Tesla is recalling nearly all vehicles sold in the U.S., more than 2 million, to update software and fix a defective system that’s supposed to ensure drivers are paying attention when using Autopilot.
Documents posted Wednesday by U.S. safety regulators say the update will increase warnings and alerts to drivers and even limit the areas where basic versions of Autopilot can operate.
The recall comes after a two-year investigation by the National Highway Traffic Safety Administration into a series of crashes that happened while the Autopilot partially automated driving system was in use. Some were deadly.
The agency says its investigation found Autopilot’s method of making sure that drivers are paying attention can be inadequate and can lead to “foreseeable misuse of the system.”
The added controls and alerts will “further encourage the driver to adhere to their continuous driving responsibility,” the documents said.
But safety experts said that, while the recall is a good step, it still makes the driver responsible and doesn’t fix the underlying problem that Tesla’s automated systems have with spotting and stopping for obstacles in their path.
The recall covers models Y, S, 3 and X produced between Oct. 5, 2012, and Dec. 7 of this year. The update was to be sent to certain affected vehicles on Tuesday, with the rest getting it later.
Shares of Tesla slid more than 3% in earlier trading Wednesday but recovered amid a broad stock market rally to end the day up 1%.
The attempt to address the flaws in Autopilot seemed like a case of too little, too late to Dillon Angulo, who was seriously injured in a 2019 crash involving a Tesla that was using the technology along a rural stretch of Florida highway where the software isn’t supposed to be deployed.
“This technology is not safe, we have to get it off the road,” said Angulo, who is suing Tesla as he recovers from injuries that included brain trauma and broken bones. “The government has to do something about it. We can’t be experimenting like this.”
Autopilot includes features called Autosteer and Traffic Aware Cruise Control, with Autosteer intended for use on limited access freeways when it’s not operating with a more sophisticated feature called Autosteer on City Streets.
The software update will limit where Autosteer can be used. “If the driver attempts to engage Autosteer when conditions are not met for engagement, the feature will alert the driver it is unavailable through visual and audible alerts, and Autosteer will not engage,” the recall documents said.
Depending on a Tesla’s hardware, the added controls include “increasing prominence” of visual alerts, simplifying how Autosteer is turned on and off, and additional checks on whether Autosteer is being used outside of controlled access roads and when approaching traffic control devices. A driver could be suspended from using Autosteer if they repeatedly fail “to demonstrate continuous and sustained driving responsibility,” the documents say.
According to recall documents, agency investigators met with Tesla starting in October to explain “tentative conclusions” about fixing the monitoring system. Tesla did not concur with NHTSA’s analysis but agreed to the recall on Dec. 5 in an effort to resolve the investigation.
Auto safety advocates for years have been calling for stronger regulation of the driver monitoring system, which mainly detects whether a driver’s hands are on the steering wheel. They have called for cameras, which other automakers with similar systems already use, to make sure a driver is paying attention.
Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University who studies autonomous vehicle safety, called the software update a compromise that doesn’t address a lack of night vision cameras to watch drivers’ eyes, as well as Teslas failing to spot and stop for obstacles.
“The compromise is disappointing because it does not fix the problem that the older cars do not have adequate hardware for driver monitoring,” Koopman said.
Koopman and Michael Brooks, executive director of the nonprofit Center for Auto Safety, contend that crashing into emergency vehicles is a safety defect that isn’t addressed. “It’s not digging at the root of what the investigation is looking at,” Brooks said. “It’s not answering the question of why are Teslas on Autopilot not detecting and responding to emergency activity?”
Koopman said NHTSA apparently decided that the software change was the most it could get from the company, “and the benefits of doing this now outweigh the costs of spending another year wrangling with Tesla.”
In its statement Wednesday, NHTSA said the investigation remains open “as we monitor the efficacy of Tesla’s remedies and continue to work with the automaker to ensure the highest level of safety.”
Autopilot can steer, accelerate and brake automatically in its lane, but is a driver-assist system and cannot drive itself, despite its name. Independent tests have found that the monitoring system is easy to fool, so much so that drivers have been caught driving drunk or even sitting in the back seat.
In its defect report filed with the safety agency, Tesla said Autopilot’s controls “may not be sufficient to prevent driver misuse.”
A message was left early Wednesday seeking further comment from the Austin, Texas, company.
Tesla says on its website that Autopilot and a more sophisticated Full Self Driving system are meant to help drivers who have to be ready to intervene at all times. Full Self Driving is being tested by Tesla owners on public roads.
In a statement posted Monday on X, formerly Twitter, Tesla said safety is stronger when Autopilot is engaged.
NHTSA has dispatched investigators to 35 Tesla crashes since 2016 in which the agency suspects the vehicles were running on an automated system. At least 17 people have been killed.
The investigations are part of a larger probe by the NHTSA into multiple instances of Teslas using Autopilot crashing into emergency vehicles. NHTSA has become more aggressive in pursuing safety problems with Teslas, including a recall of Full Self Driving software.
In May, Transportation Secretary Pete Buttigieg, whose department includes NHTSA, said Tesla shouldn’t be calling the system Autopilot because it can’t drive itself.
AP Technology Writer Michael Liedtke contributed to this story.
Why Was Sam Altman Fired? Possible Ties to Data From Chinese Hacker Group D2 (Double Dragon)
Theories about why Sam Altman was fired are circulating online. On the insider tech forum Blind, one person claims to know via a third-hand account, and predicts the news will trickle into the media over the next couple of weeks.
It’s said OpenAI had been using data from D2 to train its AI models, including GPT-4. This data was allegedly obtained through a hidden business contract with a D2 shell company called Whitefly, based in Singapore. The D2 group is claimed to have the largest crawling/indexing/scanning capacity in the world, reportedly 10x that of Alphabet Inc (Google), hence the deal, which let OpenAI get its hands on vast quantities of training data after exhausting its other options.
The Chinese government became aware of this arrangement and raised concerns with the Biden administration. As a result, the NSA launched an investigation, which confirmed that OpenAI had been using data from D2. Satya Nadella, the CEO of Microsoft, which is a major investor in OpenAI, was informed of the findings and ordered Altman’s removal.
There was also a suggestion that Altman refused to disclose this information to the OpenAI board. This lack of candor ultimately led to his dismissal and is what the board publicly alluded to when it said he was “not consistently candid in his communications with the board.”
To summarize what happened with Sam Altman’s firing:
1. Sam Altman was removed from OpenAI due to his ties to a Chinese cyber army group.
2. OpenAI had been using data from D2 to train its AI models.
3. The Chinese government raised concerns about this arrangement with the Biden administration.
4. The NSA launched an investigation, which confirmed OpenAI’s use of D2 data.
5. Satya Nadella ordered Altman’s removal after being informed of the findings.
6. Altman refused to disclose this information to the OpenAI board.
We’ll see in the next couple of weeks if this story holds up or not.