These latter two cards are so heavy, they slip out of my wallet. Friction is not enough to hold them in. Also, the metallic sound when dropping it on the table is not exactly pleasing. What exactly is the point of using titanium?
Likely designed to do precisely that - causing heads to turn. Free advertising for them, provided by (an unwilling) you.

Does anybody else feel the Apple Card is inconveniently heavy? Most credit cards weigh around 5 g. The Amazon Prime Card is more than 12 g, only to be topped by the Apple Card at almost 15 g! These latter two cards are so heavy, they slip out of my wallet. Friction is not enough to hold them in. Also, the metallic sound when dropping it on the table is not exactly pleasing. What exactly is the point of using titanium?
This is generation 1 of the Apple Card. Future releases will be thinner and thinner until it is a sliver of titanium foil.

Ric Ford

Malcolm Owen said:
Apple Card users having problems with making payments
A number of Apple Card customers are having problems managing their credit card, with reports of users struggling to make payments to the credit facility through the Wallet app, with the distinct possibility of some users accidentally making multiple payments against their account balance.

... Users affected by the issue are finding they are waiting for considerable lengths of time before being able to get in contact with a Goldman Sachs representative of the service via Apple Business Chat or over the phone. So far affected customers are being informed of the existence of an "issue with their available credit or balance displayed within Wallet," and that it would be resolved shortly.

Some data about the scale and financial resources required for the Apple Card have been released. From the launch in late August to September 30 (a little over a month!), $10 billion in credit lines was issued. Within those credit lines, $736 million is in loan balances.

For context, a credit union with a national footprint, PenFed, has $25 billion in assets and 1.8 million customers.

My biggest issue with Apple Card is that I cannot set up payments in my online banking app... you can only make payments via a bank account attached to your Apple Wallet, which scares me.

RonL, actually you can make payments from your bank without the Apple Card pulling the payment from the bank. The Apple Card statement has the mailing address, and while it doesn't say you can make payments to that address, it does work. Just be certain to include your Apple Card account number along with the statement address when you set up payments with your bank.

Our first payment from the bank took 11 days to appear, but the next month's payment took about a day. I suspect the first payment was a paper check from the bank, and once that worked, they switched it to an electronic payment method.

Ric Ford

Currently in the news:
Reuters said:
Goldman faces probe after entrepreneur slams Apple Card algorithm in tweets
... The Department of Financial Services “will be conducting an investigation to determine whether New York law was violated and ensure all consumers are treated equally regardless of sex,” a department spokeswoman told Reuters in a statement.

“Any algorithm that intentionally or not results in discriminatory treatment of women or any other protected class violates New York law.”

Ric Ford

And Woz points out the critical issues being overlooked:
Bloomberg said:
Apple Co-Founder Says Goldman’s Apple Card Algo Discriminates
... Now another high-profile user of the Apple Card -- Apple co-founder Steve Wozniak -- is calling for the government to get involved, citing excessive corporate reliance on mysterious technology.

“These sorts of unfairnesses bother me and go against the principle of truth. We don’t have transparency on how these companies set these things up and operate,” Wozniak said in an interview on Sunday. “Our government isn’t strong enough on the issues of regulation. Consumers can only be represented by the government because the big corporations only represent themselves.”

Wozniak said he can borrow 10 times as much as his wife on their Apple Cards even though they share bank and other credit card accounts, and that other lenders treat them equally.

“Algos obviously have flaws,” Wozniak said. “A huge number of people would say, ‘We love our technology but we are no longer in control.’ I think that’s the case.”

Ric Ford said:
... Now another high-profile user of the Apple Card -- Apple co-founder Steve Wozniak -- is calling for the government to get involved, citing excessive corporate reliance on mysterious technology...
We will need to see what the results are of the investigation. There may be factors that haven't been reported. For instance, even though you don't provide income data on the application, that doesn't mean that the bank isn't going to get those numbers from other sources. And there could simply be a bug in the code. "Never attribute to malice what can adequately be explained by stupidity."

Lender bias is a long-standing problem in the US, so I think it's good this inequity has been exposed. I'm especially interested in seeing how Apple and Marcus Bank respond. Will they make excuses or focus on remedies? Or remain silent, as they have been since the story broke? Apple's culture of secrecy will not serve them well here, especially if Apple user data is involved.

Ric Ford

We will need to see what the results are of the investigation. There may be factors that haven't been reported. For instance, even though you don't provide income data on the application, that doesn't mean that the bank isn't going to get those numbers from other sources. And there could simply be a bug in the code. "Never attribute to malice what can adequately be explained by stupidity."
While there may be a "bug in the code", there are enormous issues signified in this small example that are independent of any coding bugs, as expressed by Woz. In fact, the issue of "black boxes" is rapidly becoming an existential issue, as A.I. becomes ever more pervasive and powerful, growing far beyond the ability of any individual or team to understand or control it.

I vividly remember, decades ago, the rare privilege of being invited to lunch with the brilliant creator of a critical proprietary system at a giant computer company, and learning to my astonishment that he, its inventor, could no longer completely understand how it worked - it had grown too complex. And that system was far simpler than ordinary personal computer systems of today, which are orders of magnitude simpler than the computer systems of corporations such as Apple, Google, and Facebook, which are infinitely simpler and more controllable than future A.I. technologies developing on top of trillions and trillions of very personal data records being constantly collected today.

The danger, of course, as science fiction tried to teach us years ago, is that these "black boxes" can (and will) produce unpredictable, "malicious", and deadly actions through their "stupidity." Here's a recent example:
NTSB said:
Automated Test Vehicle Subject of Board Meeting
... A 49-year-old woman died when the test vehicle struck her as she was walking a bicycle midblock across Mill Avenue in Tempe, Arizona, on March 18, 2018. The test vehicle, a 2017 Volvo XC90 sport utility vehicle modified with an Uber Advanced Technologies Group developmental automated driving system, was occupied by one operator who was not injured in the crash. The vehicle was controlled by the Uber Advanced Technologies Group developmental automated driving system as it encountered the pedestrian.
Previously released information about the NTSB’s investigation of the crash is available online at
Here's another:
NBC said:
Tesla Sued by Family of Apple Engineer Who Died in Model X Autopilot Crash in Silicon Valley
... Lawyers for the family of 38-year-old Walter Huang told reporters at a press conference Wednesday that Huang was on his way to drop off his children at school on March 23, 2018 when he died from injuries he suffered when the Autopilot of his 2017 Tesla Model X drove his SUV into the unprotected edge of a concrete highway median that was missing its crash guard.
“We want to ensure the technology behind semi-autonomous cars is safe before it is released on the roads, and its risks are not withheld or misrepresented to the public," the family's attorneys said.
#AI #complexity #blackboxes

Ric Ford

We will need to see what the results are of the investigation....
Here's an update (see the included link):
CNBC said:
Regulator probing Goldman over Apple Card: Gender bias must be rooted out of process
Companies that deploy biased algorithms — even unknowingly — are still responsible for potential discriminatory outcomes, the Wall Street regulator who is probing Goldman Sachs' Apple Card told CNBC on Monday.

"Algorithms don't get immunity from discrimination," said Linda Lacewell, superintendent of New York's Department of Financial Services, which is investigating claims that Goldman Sachs' Apple Card discriminated against women when determining credit limits.

... In a statement released Sunday, Goldman said it does not consider gender in credit decisions and evaluates all applications independently. Goldman also said it is looking into ways for family members to share a single Apple Card account.

Lacewell said that her agency, which regulates banks in New York, has been in contact with representatives from Goldman and could sit down with them as soon as Tuesday.

When asked whether DFS was investigating both Goldman and Apple, Lacewell responded that it was looking into "the practice."

"Goldman is the bank that stands behind the Apple Card," she continued. "We actually license Goldman ... We've asked the company to begin explaining what the algorithm is."
Here's a different example of AI black box problems regarding gender bias (via the link above):
Reuters said:
Amazon scraps secret AI recruiting tool that showed bias against women
... In effect, Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word “women’s,” as in “women’s chess club captain.” And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter. They did not specify the names of the schools.

Amazon edited the programs to make them neutral to these particular terms. But that was no guarantee that the machines would not devise other ways of sorting candidates that could prove discriminatory, the people said.

Apple Card application decisions may not even rely much on AI. A scoring algorithm could simply take trust/don't trust inputs from a number of outside data providers and generate credit card decisions.

While retailers have shared information about serial refunders and returners among themselves for decades, that and a lot more behavioral data has become available to financial services companies. Now seemingly innocent actions, such as complaining about an Airbnb rental or returning a purchase, have the potential to affect the size of a credit line or the terms of a mortgage.

For example, say you buy and return a lot of Ethernet cables, routers, and switches on Amazon because you install networks for a living. Or you subscribe to a monthly "curated clothing" box where you pay for the clothes up front and return the garments you don't want for a refund. Both of these could look suspicious to a behavioral data aggregator and generate a negative personal rating. And just imagine what a heavy user of Coinbase might look like to a credit card issuer!
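To make the speculation above concrete, here is a minimal sketch (all provider names, weights, and numbers are invented for illustration, not Goldman's actual process) of how a non-AI scorer could turn trust/don't-trust flags from outside data aggregators into a credit line:

```python
# Hypothetical sketch: a credit-line decision with no machine learning at all,
# just trust/don't-trust flags bought from outside behavioral-data providers.
# Provider names and penalty weights are invented for illustration.

def credit_line(base_limit, provider_flags, weights):
    """Scale a base credit limit by weighted trust flags.

    provider_flags: dict mapping provider name -> True (trust) / False (don't trust)
    weights: dict mapping provider name -> penalty applied when flagged untrusted
    """
    multiplier = 1.0
    for provider, trusted in provider_flags.items():
        if not trusted:
            multiplier -= weights.get(provider, 0.0)
    multiplier = max(multiplier, 0.0)  # never extend a negative credit line
    return round(base_limit * multiplier, 2)

# A frequent returner (say, a network installer returning cables) gets flagged
# by a retail-returns aggregator and sees a much smaller line:
flags = {"retail_returns": False, "rental_reviews": True}
weights = {"retail_returns": 0.4, "rental_reviews": 0.2}
limit = credit_line(10_000, flags, weights)  # 10000 * (1 - 0.4) = 6000.0
```

The point of the sketch is that such a pipeline is entirely deterministic, yet just as opaque to the applicant as any neural network, because the inputs come from data brokers the applicant has never heard of.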

Here's a recent article for anybody interested in knowing more:
NY Times said:
I Got Access to My Secret Consumer Score. Now You Can Get Yours, Too
Little-known companies are amassing your data — like food orders and Airbnb messages — and selling the analysis to clients. Here’s how to get a copy of what they have on you.
#privacy #security

Ric Ford

And some more notes from The Verge:
Dieter Bohn said:
Apple owns every mistake Goldman Sachs makes with its card
... But before I comment on any of this, I want to point you to the blog post by Jamie Heinemeier Hansson about the whole ordeal. She is the person who was denied the full measure of what her credit limit should have been because of Apple and Goldman Sachs’ black-box algorithm. You should read it in full, even though I’m about to quote a bit below, because it is important, level-headed, and blisteringly accurate.

... Of course, you wouldn’t expect Goldman Sachs to apologize, because that could be used against it in the upcoming lawsuit. You also wouldn’t expect Goldman Sachs to apologize because it’s Goldman Effing Sachs, one of the architects of the housing crisis a decade ago that had to pay a $5 billion settlement and admit to a series of facts about how it misled investors.

You know, the company that Apple partnered with to launch the Apple Card.

Apple is trying to have all the benefits of a consumer and privacy-friendly credit card without any of the hassles that come along with it. Take a look at Apple’s promotional page for Apple Card: right now it’s led by the tagline “Created by Apple, not a bank.” I’m sure it was, in the same way that Apple’s products are “Designed in California” but assembled in China — though at least then Apple oversees the manufacturing and tries to guarantee a basic level of human rights.

If this were a problem unique to Apple, one might be worried. But it could well be a reflection of the credit-worthiness of folk as measured by the usual suspects.

The nice thing about "AI" is that you simply don't need to tell it to take specific notice of race, sex, gender, height, etc. All of those are absorbed into the data/outcome pattern-matching. So Apple's analyses arrive at the results you'd expect from ordinary credit-rating reports.

Further, in my experience, you can call up your credit card company and say, "Hey, I need a bigger credit limit - look I had a limit of $nnnnnn on my old XYZ Bank card and it all worked well and I've canceled it in favour of yours, so there's no new credit ballooning about."

Ric Ford

In other words:
Arwa Mahdawi said:
Apple’s ‘sexist’ credit card isn’t just a PR problem – it’s a nightmare for us all
... Like God, algorithms often work in mysterious ways, making opaque decisions not even the program’s creators can understand. Machine-learning algorithms don’t need to be told to take factors such as gender or race into account to make sexist or racist assumptions based on the biased data fed in. And this has worrying ramifications. Increasingly, every aspect of our lives, from our employability to our trustworthiness and our creditworthiness, is being influenced by these opaque algorithms. This isn’t just a PR problem for Apple and Goldman Sachs, it’s a nightmare for us all.

You want a nightmare? Google is getting into banking. They already know about all your shipments by skimming your emails, they might as well know how you are paying for all those items.
Reuters said:
Google Pay to offer checking accounts through Citi, Stanford Federal
The project, codenamed Cache, comes as rivals Facebook Inc (FB.O) and Apple Inc (AAPL.O) are expanding their own efforts in consumer finance, a broad area that ranges from digital payment apps to bank accounts, brokerage accounts and loans, and which offer Silicon Valley new sources of revenue and new opportunities to strengthen ties with users.

U.S. regulators and lawmakers have expressed concern about how those companies’ massive influence and poor records on data privacy will play out as they try to gain ground in finance. The scrutiny most recently prompted Facebook’s partners to pull back from plans to support the launch of a digital currency.

I wanted to mention [re the discussion] that my wife is consistently rated a couple of points better than me, the actual breadwinner, on both Equifax and Transunion....

Wait a minute, if they share all financial accounts why does she have a higher credit score?
If I recall correctly from an interview clip I saw, even though he has money, he doesn't pay some of his bills on time (perhaps due to distraction). They may have dual-listed major asset accounts, but some of the bills/liabilities probably have just one name on them – the Apple Card certainly does. There may also be a component from starting lower, too.

Ric Ford

More about fundamental issues that have surfaced with Apple's credit card:
Wired said:
The Apple Card Didn't 'See' Gender—and That's the Problem
The way its algorithm determines credit lines makes the risk of bias more acute

... The response from Apple just added confusion and suspicion. No one from the company seemed able to describe how the algorithm even worked, let alone justify its output. While Goldman Sachs, the issuing bank for the Apple Card, insisted right away that there isn't any gender bias in the algorithm, it failed to offer any proof. Then, finally, Goldman landed on what sounded like an ironclad defense: The algorithm, it said, has been vetted for potential bias by a third party; moreover, it doesn’t even use gender as an input. How could the bank discriminate if no one ever tells it which customers are women and which are men?

This explanation is doubly misleading. For one thing, it is entirely possible for algorithms to discriminate on gender, even when they are programmed to be “blind” to that variable. For another, imposing willful blindness to something as critical as gender only makes it harder for a company to detect, prevent, and reverse bias on exactly that variable.
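The Wired article's first point can be demonstrated in a few lines. This is a toy illustration with synthetic data (the feature name and all numbers are invented): a model fitted without any gender column still produces gender-skewed limits whenever a training feature happens to correlate with gender.

```python
# Toy illustration of proxy discrimination: the model below is fitted using
# ONLY a proxy feature -- it never sees the gender column -- yet its outputs
# differ sharply by group because the proxy correlates with gender and the
# historical limits encode past human bias. All data here is synthetic.

# Synthetic applicants: (gender, proxy_feature, historical_limit)
history = [
    ("M", 0.90, 10_000), ("M", 0.80, 9_000), ("M", 0.85, 9_500),
    ("F", 0.20, 4_000), ("F", 0.30, 4_500), ("F", 0.25, 4_200),
]

# Ordinary least squares on the proxy alone: limit = a * proxy + b
xs = [x for _, x, _ in history]
ys = [y for _, _, y in history]
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
     / sum((x - mean_x) ** 2 for x in xs))
b = mean_y - a * mean_x

def predicted_limit(proxy):
    return a * proxy + b

# New applicants with group-typical proxy values get very different limits,
# even though gender was never an input to the fit:
limit_m = predicted_limit(0.85)
limit_f = predicted_limit(0.25)
```

Removing the gender column didn't remove the bias; it only removed the ability to measure it directly, which is exactly the article's second point.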

#AI #bias
