No More Excuses | Goals First #6 | Plutus
Summary
In the first episode of his weekly live show, Plutus discusses excuses. He explains the difference between fake excuses and real excuses, and more importantly why both are still simply "excuses". Plutus talks about his app startup and his own plans for success, along with his desire to share what he learns and to invite listeners to join him in putting their Goals First.
First published
12/17/2020
Genres
Duration
20 minutes
Parent Podcast
Goals First
Similar Episodes
Taking a Small First Step toward Bigger Goals, with Anthony Chen, Host of Family Business Radio
02/12/2024
In a commentary from a recent Family Business Radio episode, host Anthony Chen discussed the value of taking a small first step when you have bigger goals in mind. Anthony's commentary was taken from this episode of Family Business Radio. Family Business Radio is […] The post Taking a Small First Step toward Bigger Goals, with Anthony Chen, Host of Family Business Radio appeared first on Business RadioX ®.
Clean
#27 Why Write Goals | Gentlemen’s Brawl
08/28/2015
Why write goals in the first place? The goal writing process is the first step in actually achieving the goal. Writing the goal makes it real. The post #27 Why Write Goals | Gentlemen’s Brawl appeared first on Courage Hub.
Clean
AGI safety from first principles: Goals and Agency by Richard Ngo
11/17/2021
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: AGI safety from first principles: Goals and Agency, published by Richard Ngo on the AI Alignment Forum.
The fundamental concern motivating the second species argument is that AIs will gain too much power over humans, and then use that power in ways we don’t endorse. Why might they end up with that power? I’ll distinguish three possibilities:
- AIs pursue power for the sake of achieving other goals; i.e. power is an instrumental goal for them.
- AIs pursue power for its own sake; i.e. power is a final goal for them.
- AIs gain power without aiming towards it; e.g. because humans gave it to them.
The first possibility has been the focus of most debate so far, and I’ll spend most of this section discussing it. The second hasn’t been explored in much depth, but in my opinion is still important; I’ll cover it briefly in this section and the next. Following Christiano, I’ll call agents which fall into either of these first two categories influence-seeking. The third possibility is largely outside the scope of this document, which focuses on dangers from the intentional behaviour of advanced AIs, although I’ll briefly touch on it here and in the last section.
The key idea behind the first possibility is Bostrom’s instrumental convergence thesis, which states that there are some instrumental goals whose attainment would increase the chances of an agent’s final goals being realised for a wide range of final goals and a wide range of situations. Examples of such instrumentally convergent goals include self-preservation, resource acquisition, technological development, and self-improvement, which are all useful for executing further large-scale plans. I think these examples provide a good characterisation of the type of power I’m talking about, which will serve in place of a more explicit definition.
However, the link from instrumentally convergent goals to dangerous influence-seeking is only applicable to agents which have final goals large-scale enough to benefit from these instrumental goals, and which identify and pursue those instrumental goals even when it leads to extreme outcomes (a set of traits which I’ll call goal-directed agency). It’s not yet clear that AGIs will be this type of agent, or have this type of goals.
It seems very intuitive that they will, because we all have experience of pursuing instrumentally convergent goals, for example by earning and saving money, and can imagine how much better we’d be at them if we were more intelligent. Yet since evolution has ingrained in us many useful short-term drives (in particular the drive towards power itself), it’s difficult to determine the extent to which human influence-seeking behaviour is caused by us reasoning about its instrumental usefulness towards larger-scale goals. Our conquest of the world didn’t require any humans to strategise over the timeframe of centuries, but merely for many individuals to expand their personal influence in a relatively limited way - by inventing a slightly better tool, or exploring slightly further afield.
Furthermore, we should take seriously the possibility that superintelligent AGIs might be even less focused than humans are on achieving large-scale goals. We can imagine them possessing final goals which don’t incentivise the pursuit of power, such as deontological goals, or small-scale goals. Or perhaps we’ll build “tool AIs” which obey our instructions very well without possessing goals of their own - in a similar way to how a calculator doesn’t “want” to answer arithmetic questions, but just does the calculations it’s given. In order to figure out which of these options is possible or likely, we need to better understand the nature of goals and goal-directed agency. That’s the focus of this section.
Frameworks for thinking about agency
To begin, it’s crucial to distinguish between ...
Clean
Breaking Down Goals
03/30/2023
Setting goals for yourself is a big task; even bigger is actually accomplishing those goals. When a goal is large it can seem overwhelming and unachievable. The post Breaking Down Goals appeared first on Complete Developer Podcast.
Clean
Similar Podcasts
PODCASTS - WELCOME TO HILLSIDE
08/12/2020
First Evangelical Free Church of Tahlequah
Come hear the Word of God preached at EFreeTahlequah anywhere in the world.
Clean
Mama Bear Apologetics
08/12/2020
Hillary Morgan Ferrer & Amy Davison
Mama Bear Apologetics is a podcast for mothers of biological, adopted, or spiritual children who want to learn about how to defend the Christian faith, help give their children reasons for faith, and understand the worldviews that challenge Christian faith in the first place.
Clean
Rock The Walls
08/12/2020
idobi Network
Always on the frontlines, Rock The Walls is hosted by music fan and devoted radio host Patrick Walford. Over one thousand interviews are already in the can, from being the first ever radio interview for bands like I Prevail & The Story So Far, to speaking with heavy & alternative music legends such as The Used, Anthrax, Parkway Drive, Godsmack, Korn, Sum 41, Bring Me The Horizon, A Day To Remember, and hundreds more. After doing the show for over a decade, hosting Warped Radio, bringing you your idobi Music News, and Music Directing idobi Howl, along with hitting the road for coverage on the Warped at Sea Cruise in 2017 & the final Vans Warped Tour in 2018, Walford is a long trusted voice in the music scene. Tune in to hear in-depth interviews you won't hear anywhere else with all your favorite heavy & alternative artists, along with spinning the best in new music.
Clean